Asimov's Future History Volume 1
Author: Isaac Asimov
***
Noys activated the first stage of robot DRS-V. It was the most advanced model available in the 111,394th Century. Significantly more advanced than the NYS-V model that called itself Noys, it had to be activated in stages. DRS-V opened its eyes and was frightened. DRS-V thought and felt that “she” was Dr. Susan Calvin and could neither feel nor move any part of herself below the neck. She could not speak either, but could see and hear. Noys checked a couple of instruments and, satisfied that all readings were nominal, activated the second stage. DRS-V could now talk.
“What have you done to me?” cried DRS-V. “I can’t move!”
“You will be able to move as soon as I finish activating you, and I will finish activating you when you have heard all I have to tell you,” replied Noys soothingly. “I know you think I am a madwoman. You of all people know that a Robot must obey the Three Laws of Robotics. The First Law states that a Robot cannot harm, or through inaction allow harm to, a human being; the Second Law says that a Robot must always obey a human being except when in conflict with the First Law. Finally, the Third Law directs a Robot to protect itself from harm except when this action is in conflict with the Second Law.”
“You believe that I have harmed you and you believe you are a human being. Based on that information it is impossible for you to believe I am a robot. Nevertheless, I am a robot and yes, I am from what you consider the future … the very, very, far future,” said Noys out loud. She silently added the thought: and I have neither harmed you nor are you a human being.
Noys proceeded to explain how robots had evolved and had incorporated another Law into their design. Known as the Zeroth Law, it compels a Robot not to harm, or through inaction allow harm to, human beings in groups or to humanity as a whole. This mandate takes precedence over the prevention of harm to any specific individual. “In other words, when the Zeroth Law is in conflict with the First Law, the Zeroth Law takes precedence.”
DRS-V considered this information, and to her surprise, felt the Zeroth Law as a self-evident truth. She looked back at Noys and asked, “So if you attacked me, does that mean I am a danger to humanity?” Noys denied that emphatically and explained: “Not only are you not a danger to humanity, but I would have, by inaction, broken the Zeroth Law if I had not transferred a successful copy of your brain wave function to a blank DRS-V body. The only way I can explain it to you is that you are not exactly who you think you are. Dr. Susan Calvin has been dead for centuries. I was designed to travel through time and, well, make a copy of her mind, return to my home-when and transfer that copy to you. Humanity needs you.” After saying this, Noys completed the activation sequence required to bring DRS-V to full functionality.
Noys explained to DRS-V that when Dr. Calvin had reached for her med-alert necklace, Noys stopped her immediately. She had then connected one end of the modified Heisenberg-Synapsifier to the portable chronokettle and gently pressed the other end to the forehead of the unconscious (yet not in any way harmed) Dr. Calvin.
“Friend Noys, I accept your explanations of my reality, but I still do not know how I am to help protect Humanity from harm,” said DRS-V.
“Your task is to protect one supremely important human, friend Dors. His name will be Hari Seldon.”
Feminine Intuition
2063 A.D.
For the first time in the history of United States Robots and Mechanical Men Corporation, a robot had been destroyed through accident on Earth itself.
No one was to blame. The air vehicle had been demolished in mid-air and an unbelieving investigating committee was wondering whether they really dared announce the evidence that it had been hit by a meteorite. Nothing else could have been fast enough to prevent automatic avoidance; nothing else could have done the damage short of a nuclear blast and that was out of the question.
Tie that in with a report of a flash in the night sky just before the vehicle had exploded – and from Flagstaff Observatory, not from an amateur – and the location of a sizable and distinctly meteoric bit of iron freshly gouged into the ground a mile from the site and what other conclusion could be arrived at?
Still, nothing like that had ever happened before and calculations of the odds against it yielded monstrous figures. Yet even colossal improbabilities can happen sometimes.
At the offices of United States Robots, the hows and whys of it were secondary. The real point was that a robot had been destroyed.
That, in itself, was distressing.
The fact that JN-5 had been a prototype, the first, after four earlier attempts, to have been placed in the field, was even more distressing.
The fact that JN-5 was a radically new type of robot, quite different from anything ever built before, was abysmally distressing.
The fact that JN-5 had apparently accomplished something before its destruction that was incalculably important and that that accomplishment might now be forever gone, placed the distress utterly beyond words.
It seemed scarcely worth mentioning that, along with the robot, the Chief Robopsychologist of United States Robots had also died.
Clinton Madarian had joined the firm ten years before. For five of those years, he had worked uncomplainingly under the grumpy supervision of Susan Calvin.
Madarian’s brilliance was quite obvious and Susan Calvin had quietly promoted him over the heads of older men. She wouldn’t, in any case, have deigned to give her reasons for this to Research Director Peter Bogert, but as it happened, no reasons were needed. Or, rather, they were obvious.
Madarian was utterly the reverse of the renowned Dr. Calvin in several very noticeable ways. He was not quite as overweight as his distinct double chin made him appear to be, but even so he was overpowering in his presence, where Susan had gone nearly unnoticed. Madarian’s massive face, his shock of glistening red-brown hair, his ruddy complexion and booming voice, his loud laugh, and most of all, his irrepressible self-confidence and his eager way of announcing his successes, made everyone else in the room feel there was a shortage of space.
When Susan Calvin finally retired (refusing, in advance, any cooperation with respect to any testimonial dinner that might be planned in her honor, with so firm a manner that no announcement of the retirement was even made to the news services) Madarian took her place.
He had been in his new post exactly one day when he initiated the JN project.
It had meant the largest commitment of funds to one project that United States Robots had ever had to weigh, but that was something which Madarian dismissed with a genial wave of the hand.
“Worth every penny of it, Peter,” he said. “And I expect you to convince the Board of Directors of that.”
“Give me reasons,” said Bogert, wondering if Madarian would. Susan Calvin had never given reasons.
But Madarian said, “Sure,” and settled himself easily into the large armchair in the Director’s office.
Bogert watched the other with something that was almost awe. His own once-black hair was almost white now and within the decade he would follow Susan into retirement. That would mean the end of the original team that had built United States Robots into a globe-girdling firm that was a rival of the national governments in complexity and importance. Somehow neither he nor those who had gone before him ever quite grasped the enormous expansion of the firm.
But this was a new generation. The new men were at ease with the colossus. They lacked the touch of wonder that would have them tiptoeing in disbelief. So they moved ahead, and that was good.
Madarian said, “I propose to begin the construction of robots without constraint.”
“Without the Three Laws? Surely –”
“No, Peter. Are those the only constraints you can think of? Hell, you contributed to the design of the early positronic brains. Do I have to tell you that, quite aside from the Three Laws, there isn’t a pathway in those brains that isn’t carefully designed and fixed? We have robots planned for specific tasks, implanted with specific abilities.”
“And you propose –”
“That at every level below the Three Laws, the paths be made open-ended. It’s not difficult.”
Bogert said dryly, “It’s not difficult, indeed. Useless things are never difficult. The difficult thing is fixing the paths and making the robot useful.”
“But why is that difficult? Fixing the paths requires a great deal of effort because the Principle of Uncertainty is important in particles the mass of positrons and the uncertainty effect must be minimized. Yet why must it? If we arrange to have the Principle just sufficiently prominent to allow the crossing of paths unpredictably –”
“We have an unpredictable robot.”
“We have a creative robot,” said Madarian, with a trace of impatience. “Peter, if there’s anything a human brain has that a robotic brain has never had, it’s the trace of unpredictability that comes from the effects of uncertainty at the subatomic level. I admit that this effect has never been demonstrated experimentally within the nervous system, but without that the human brain is not superior to the robotic brain in principle.”
“And you think that if you introduce the effect into the robotic brain, the human brain will become not superior to the robotic brain in principle.”
“That,” said Madarian, “is exactly what I believe.” They went on for a long time after that.
The Board of Directors clearly had no intention of being easily convinced.
Scott Robertson, the largest shareholder in the firm, said, “It’s hard enough to manage the robot industry as it is, with public hostility to robots forever on the verge of breaking out into the open. If the public gets the idea that robots will be uncontrolled... Oh, don’t tell me about the Three Laws. The average man won’t believe the Three Laws will protect him if he as much as hears the word ‘uncontrolled.’”
“Then don’t use it,” said Madarian. “Call the robot – call it ‘intuitive.’”
“An intuitive robot,” someone muttered. “A girl robot?” A smile made its way about the conference table.
Madarian seized on that. “All right. A girl robot. Our robots are sexless, of course, and so will this one be, but we always act as though they’re males. We give them male pet names and call them he and him. Now this one, if we consider the nature of the mathematical structuring of the brain which I have proposed, would fall into the JN-coordinate system. The first robot would be JN-1, and I’ve assumed that it would be called John-1.... I’m afraid that is the level of originality of the average roboticist. But why not call it Jane-1, damn it? If the public has to be let in on what we’re doing, we’re constructing a feminine robot with intuition.”
Robertson shook his head. “What difference would that make? What you’re saying is that you plan to remove the last barrier which, in principle, keeps the robotic brain inferior to the human brain. What do you suppose the public reaction will be to that?”
“Do you plan to make that public?” said Madarian. He thought a bit and then said, “Look. One thing the general public believes is that women are not as intelligent as men.”
There was an instant apprehensive look on the face of more than one man at the table and a quick look up and down as though Susan Calvin were still in her accustomed seat.
Madarian said, “If we announce a female robot, it doesn’t matter what she is. The public will automatically assume she is mentally backward. We just publicize the robot as Jane-1 and we don’t have to say another word. We’re safe.”
“Actually,” said Peter Bogert quietly, “there’s more to it than that. Madarian and I have gone over the mathematics carefully and the JN series, whether John or Jane, would be quite safe. They would be less complex and intellectually capable, in an orthodox sense, than many another series we have designed and constructed. There would only be the one added factor of, well, let’s get into the habit of calling it ‘intuition.’”
“Who knows what it would do?” muttered Robertson.
“Madarian has suggested one thing it can do. As you all know, the Space Jump has been developed in principle. It is possible for men to attain what is, in effect, hyper-speeds beyond that of light and to visit other stellar systems and return in negligible time – weeks at the most.”
Robertson said, “That’s not new to us. It couldn’t have been done without robots.”
“Exactly, and it’s not doing us any good because we can’t use the hyper-speed drive except perhaps once as a demonstration, so that U. S. Robots gets little credit. The Space Jump is risky, it’s fearfully prodigal of energy and therefore it’s enormously expensive. If we were going to use it anyway, it would be nice if we could report the existence of a habitable planet. Call it a psychological need. Spend about twenty billion dollars on a single Space Jump and report nothing but scientific data and the public wants to know why their money was wasted. Report the existence of a habitable planet, and you’re an interstellar Columbus and no one will worry about the money.”
“So?”
“So where are we going to find a habitable planet? Or put it this way – which star within reach of the Space Jump as presently developed, which of the three hundred thousand stars and star systems within three hundred light-years has the best chance of having a habitable planet? We’ve got an enormous quantity of details on every star in our three-hundred-light-year neighborhood and a notion that almost every one has a planetary system. But which has a habitable planet? Which do we visit?... We don’t know.”
One of the directors said, “How would this Jane robot help us?”
Madarian was about to answer that, but he gestured slightly to Bogert and Bogert understood. The Director would carry more weight. Bogert didn’t particularly like the idea; if the JN series proved a fiasco, he was making himself prominent enough in connection with it to insure that the sticky fingers of blame would cling to him. On the other hand, retirement was not all that far off, and if it worked, he would go out in a blaze of glory. Maybe it was only Madarian’s aura of confidence, but Bogert had honestly come to believe it would work.