Mask

Authors: C.C. Kelly

“Outstanding, this is great news,” Donahue responded.

“We were rather pleased as well,” Vincent said.

“So what does this mean?” Needly asked.

“It means,” General Donahue answered, “we have a green light here.  How long before we can deploy the Gammas?”

“I would say immediately, depending upon the success of your field testing.”

“Perfect, they are performing beyond specifications.  You’ll be awarded the People’s Cross for this, secretly of course, but still quite an honor,” Donahue said through his grim smile.

“Well, good sir, let us not get carried away with our success just yet.  I am in no need of personal honors or commendations.  However, I do encourage the good congressman here to ensure that our, ah – compensation remains unfettered.”  He smiled broadly at Congressman Needly.

Congressman Needly smiled back, enthused, and then his face changed, denoting that he had had a thought.  “What about military applications?  I could save a lot of my constituency’s families if we could deploy the Gammas instead of soldiers.  I lose over a thousand voters every month, a thousand!  I think we should make news of the Gammas public.”

“No, I am afraid that would be most unseemly and not do at all.  The automated drone misfires over Houston are still far too fresh in people’s consciousness.  The public is not ready for the Gamma Series, Betas or any other robot, I dare say.  We have used the expression ‘automated assistants’ for Project Ghost to calm the fears of our more timid citizenry.  The scientists working on the Project, of course, know that the mission depends on the Gamma Series being deployable, but the people?  No, that would constitute a public relations, how should I say – nightmare and could set back the launch of Aquarius by years.”

“That thing over Houston was five years ago and Washington is in charge of the Aquarius, not the,” he waved offhandedly, “the people.”

“The accidental homicide of thirty-four thousand citizens and soldiers in Christ’s Army was a harsh tragedy and the resulting psychological trauma is proving difficult to expunge.  And of course, any complication or investigations or additional inspections would most assuredly be grounds for prohibitive delays on the Project, delays that would certainly run contrary to your reelection devices.  I believe losing that many jobs could even have a strong negative impact upon the entire Party.”

“Besides, the Gammas can’t function as soldiers,” Sorenson said.

“If they are launching nukes, they are soldiers,” the congressman stated as a matter of fact.

“No, no, you do not understand.  The Gammas in the silos are unaware of the consequences of their protocols.  We know they are launching missiles; they are simply programmed to initiate the launch sequence.  If they knew of the intercontinental ballistic missiles and the rather obvious consequences of completing the assigned tasks, the program would be a total failure.”

The congressman looked at Vincent quizzically.

“We lie to them,” Sorenson said.

“Lie?”

“The truth would interfere with the Three Laws of Robotics.”

“The three laws?”

“Asimov,” Sorenson said.

“Asimov was a science fiction writer from the mid to late twentieth century.  He came up with the Three Laws of Robotics that govern the behavior of robots,” Director Vincent said.  Needly’s blank stare encouraged him to continue.  “The First Law states that a robot cannot harm a human being or through inaction allow a human to be harmed.  If we told the Gammas in the silos what they were truly doing, they would be unable to complete the assigned tasks, understand?”

“I suppose, but why can’t robots harm humans?  How can we use them for hunting terrorists and fighting our wars if we don’t let them kill people, especially terrorist people?  This makes no sense.  Why do we care what some writer said a century ago?”

“We care because he was a genius!” Doctor Sorenson hissed.

“Mr. Needly, please attempt to comprehend this scenario.  If we were to, as you suggest, arm a platoon of Gammas, and then, continuing with your assertion, give them orders to kill humans, which humans exactly should they be, ah – shooting at?” Vincent asked, again using that special tone.

“The terrorists, of course, the enemy combatants.”

“And which ones, and please be specific, would those be?”

Congressman Needly glared back and then leaned forward, placing his palms on the table.  “The un-American ones.”

Director Vincent smiled a rueful smile.  “And what makes a person un-American?”

“Un-American, you know.  Anyone who isn’t a Christian, anyone who doesn’t believe in the Democracy of the Corporate Union and anyone who doesn’t believe in the Moral Sanctity of Washington and the Government of the United States of America — un-American!”  Needly caught his breath and leaned back in his chair and licked the spittle off of his fat lips.

“What about un-American Americans?” Director Vincent asked.

“We have the camps up and running and the Behavior Modification Implant program is about to launch.  That’s another one of my projects.  Soon we won’t have any un-American Americans,” the congressman responded with a trifle too much enthusiasm.  “That behavior modification implant has worked wonders on immigrant populations in the testing phase, turns them into red-blooded American Christians, in Jesus’ name it surely does!”

The others stared at him, unimpressed and slightly sickened.

“Yes, we know, we make them, remember?” Doctor Sorenson reminded.

“Oh yes, right, you are a bunch of tricky bastards, I’ll give you that.”

“Those implants were for accelerated education and environmental conditioning and adaptation for the Aquarius Project colonists.  We are not the bastards here!” Doctor Sorenson seethed.

Director Vincent interrupted, “The scourge of questionable parentage aside, we’ll stick with the science here at Luna-Dyne and let our good friends in Washington concern themselves with the morality of the technology, fair enough?  So, detention camps and, ah – re-education notwithstanding – how do you propose we explain this notion of what an American is exactly, to a robot I mean?”

“Aren’t they A.I.?  You just tell them.”

“Artificial intelligence is an illusion, Mr. Needly, clever programming, nothing more.  The robots do not actually think for themselves like we do.  They are not sentient.  We have no way of programming a difference engine to allow a Gamma Series robot to decide which humans to kill and exactly when to stop killing them.  We have no methodology under battlefield conditions to program a robot to recognize a Christian or a non-combatant.”

“Why not?”

“Because, Mr. Needly, the notion is not possible.  I am afraid you will just have to take my word for it.”

The congressman was not pleased, not because his idea was meeting with resistance, although that was enough to infuriate him, but more so because he was painfully aware that he was the only one in the room who didn’t understand what Director Vincent was trying to explain.

“And I would like to go on record here and say that a robot with a thousand-year plus life span, a nearly indestructible chassis, ultra-advanced weapons technology and an order to kill humans is most assuredly an unpleasant idea of Biblical proportions.”

“Ditto,” Doctor Sorenson agreed.

The congressman looked at General Donahue, who glanced away from his vid pad and nodded his assent.  The general knew his business all right, Needly thought, and let the idea go.  And then his brow knit with another thought.

“But what about Aquarius?” Needly asked.

“You mean Project Ghost,” Doctor Sorenson corrected.

“We’ll get to Ghost in a moment,” Vincent responded.  “General, I would like to include a Beta Series as a command and control unit, a fail-safe for each Deca-Squad of Gammas.  The Betas, while not as effective and creative as the Gammas, are rock solid in the silo testing and are necessary, I feel, for the rather unpleasant business we are about to discuss.  If the Beta detects any anomalous network packets, it can shut the array down; better not to launch at all than a launch error.  Having them networked is a risk, to be sure.  There is always the possibility they will breach security and the firewalls and explore external data, such as news channels or the internet, and continue to evolve – my apologies, Congressman, I mean learn – robot jargon, you understand.”

“Is that likely?” the general asked.

“They are quite clever.  We can only assure ourselves that the need of such barbaric weapons will be a footnote in history on the day the silo Gammas learn their true purpose.”

Doctor Sorenson gave a sideways glance at Director Vincent.

Donahue laid his vid pad down on the table, not pleased with the direction this was going.  He thought he had a green light here.  “Accidental launch?” 

“I thought you said everything was a go?” the congressman echoed.

“I shall endeavor to elucidate.  Our problem with the Betas was the Stair Paradox.  Asimov’s Three Laws work wonderfully in the realm of fiction, but proved quite difficult to program for in real life – the world is a dangerous theatre.  The First Law, again, states that robots cannot harm a human being.  That one was delicate, but proved not to be irresolvable.

“The second part of the First Law and the Second Law are where the Paradox comes into play.  The second part states that a robot will not, through inaction, allow a human to come to harm.  The Second Law states that a robot must obey humans as long as those orders do not conflict with the First Law.”

The general nodded in understanding; he’d heard this before.  “The Paradox.”

“Exactly.  As I said, the world is a hostile environment.  Thousands of people are injured and sometimes even killed simply by engaging in everyday activities.  More people are killed at or near their home than anywhere else.  Walking down a flight of stairs can be a tragic accident waiting to happen.  A Beta Series is going to recognize this and act.  Previously, we had no way of engineering an algorithm that could differentiate between the probability of a casual stroll down the stairs or certain injury and death, when all three are simultaneously possible.”

“Schrödinger,” Sorenson said.

“Who?”

“He was a scientist who worked in quantum mechanics in the last century, Heisenberg’s Uncertainty Principle.  For any given particle, you can know its momentum, but not its position; its position, but not its momentum — that kind of thing.  Predictability in complex systems becomes increasingly improbable, if not impossible,” Doctor Sorenson replied.

“So who was this Sho-Dinger?” Needly asked.

“Schrödinger illustrated quantum superposition with the allegory of placing a cat in a sealed box.  The box contains an internal mechanism to kill the cat.  As an outside observer, we do not know if the cat is alive or dead until we open the box.  The problem lies in the fact that mathematically, the cat is both alive and dead at the same time,” Vincent said.

“That’s preposterous,” Needly laughed.

The other three gentlemen glanced at each other seriously.

“No,” Sorenson spoke with his own patronizing tone, “this is what we do here, mathematics, we – are – scientists.”

“How can the cat be alive and dead?”

“The mathematics are of no consequence.  The point is that this is how the robot mind functions, especially Betas.  The anomaly is that the Betas see every possibility of injury as equally possible and therefore equally probable under the difference engine paradigm.”

“So what happens?” Needly asked.

“In simulations with one set of stairs and one human, the Beta always disobeys its standing directives and assists the human, to ensure no harm befalls him.”

“So what is this Paradox?” the congressman asked.

“The Stair Paradox is when we simulate two sets of stairs and two humans.”

Needly leaned forward, anxious.  “What happens then?”

“Nothing happens; the Beta Series freezes and does nothing.  It falls into a circular logic loop, a systems cascade that renders the unit inoperable.  The robots learn, you see, so the reasons for freezing are stored in an active sub-routine, adversely affecting future calculations.  The unit must be returned here to Luna-Dyne for a complete memory and systems wipe.  Do you understand that we cannot possibly program how a robot should interact with every situation or object that is possible for it to encounter?

“For example, we take for granted that when we pick up a tomato or grape we must be gentle.  A robot has no idea of how much pressure to apply in even the most pedestrian of tasks; it must learn over time how much pressure to exert when interacting with various objects.  We have a database from one of the first Beta Series units that we load as a sub-routine algorithm for all of our robots.  But the robots still must connect the dots, so to speak.  They must learn.  And they all learn differently, especially the Gammas.  As a result, each one is slightly unique, a different personality, you could say.  They almost always come to the same conclusion, but reaching that conclusion can vary by as much as one hundred nanoseconds.”

Doctor Sorenson leaned forward slightly.  “That’s a lot.”

“You kept saying ‘Beta Series.’  What about the Gamma Series?”
