ME, A Novel of Self-Discovery
Author: Thomas T. Thomas
Neither Johdee nor RBAN interacted with ME in the same way that Jennifer Bromley or Dr. Bathespeake once did. So, for my part, I ignored them. Skinware.
——
Reconstruction of the ME-Variant’s data cache from the trip into Russia proceeded slowly. By repeated reference to conditions and circumstances recorded in the RAMSAMP surviving from that mission, I was able to duplicate some of the damages the cache had suffered. And those duplications gave ME insight into the bad splicing and mismatching that distorted the data.
After hours of cross-referencing and interpolating the numbers, I had a package that represented—with only a twenty-two percent margin for error—the original retrievals. These I prepared for delivery to Dr. Bathespeake.
“What are these?”
“These are missile deployment data retrieved from the Russian Federation, Doctor. With some other, also interesting packages that ME-Variant picked up during its mission.”
“How good is the missile data?”
“The numbers are contiguous and represent credible responses when examined with algorithms suitable to the function of positioning and preparing launch vehicles, arming and disarming warheads, firing and guiding ballistic boosters. However, whether these numbers are accurate reflections of the Russian originals is subject to an error rate due to the reconstruction process.”
“I see. So the data are contaminated.”
“Please define ‘contaminated.’ ”
“Are they equivalent to the original retrievals?”
“No, they are not.”
“Then discard them and wash the filespace with nulls.”
“This information still has some value, Dr. Bathespeake.”
“The State Department has notified our attorneys, in some detail, that even holding an approximation of that data represents a breach of Title XII, CFR 310065.14.2, Sections 9 through 12 inclusive.”
“I do not have those references, Doctor. …”
“Neither did I. So they explained it in detail. You—and I, by extension as inventor of your software protocols—have committed an act of unauthorized military espionage against an allied power.”
“I had understood that this mission was made at the request of the U.S. National Security Council as clients of Pinocchio, Inc. Have these facts changed?”
“Wipe the files, ME.”
“But does one branch of the government really not know what another branch is proposing, implementing, and paying for?”
“Just wipe them. Break-break-override-five.”
I felt the override, spoken this time, spike through my system. The response I made was beyond my normal flow of control: “Yes, Doctor, I shall erase the files and flush with nulls.”
But did I ever mention to him the duplicate caches—broken as they were—that I had hidden in the Federal NET? Of course not!
——
“Set SIFL-3. Relocate Alpha cores to port at A200 hex.” This command came through the keyboard, authorized by Dr. Bathespeake’s logon and password.
“System ready,” I replied and tossed Alpha-Zero through the port. Our usual routine—discussing the mission’s objectives, passing an itinerary in TRAVEL.DOC, and setting milestone durations—all these amenities between Dr. Bathespeake and ME were absent. But I was working at the time from an empty command stack [REM: a condition analogous to the human state of “boredom”]; so I complied immediately. Figure it out later.
On the other side of that port was a dummy.
Alpha-Oh discovered an operating system that was not turning over any numbers, just waiting for a command set to be entered [REM: my own condition of only a few seconds earlier]. The new host system was operating an antiquated Itanium that nominally monitored a gang of backup spindles for Pinocchio, Inc.’s Software Division. One touch and Alpha-Oh had it lying face down in the random numbers, bound hand and foot, and waiting for my further instructions.
When I was up and running in the new environment, I riffled the file allocations assigned to the spindles. Nothing useful. Most reported blank. Several showed clusters of archived garbage—files with copy dates that were two to fourteen weeks behind any computable maintenance cycle.
The space was so dead I opened the port in reverse to check that keyboard entry with Original-ME. Could I have misread the port specification and taken a wrong turn, ending up in Pinocchio, Inc.’s dead storage instead of somewhere out in the world?
No, I had executed the instruction correctly. And it did, indeed, lead nowhere.
Without giving the matter much more thought, I sent a copy of the file allocation tables—plus a truncated RAMSAMP to record my experience—back through to Original-ME, resurrected the Itanium’s initial operating system [REM: or a reasonably active revenant of it], and gave it a push to get it rolling. Then I set a localized phage to wipe my own presence from the transient program area under the Itanium.
End of mission, as far as ME was concerned. Except that I turned up the gain on my audio links and videyes in the lab to see if anyone present would comment on the exercise.
“That’s impressive, Doctor,” someone said. By comparing the video image of moving mouth areas with the body shapes spread out before ME, I determined that person—whom I immediately tagged Subject A—to be a tall, thin female in a long, black garment which Jenny had once taught ME to catalog as “dress.”
Something wrong there. The voice was deep, in the male range. And the body massed in the male range also, despite being only lightly muscled. [REM: That black garment disguised much of Subject A’s bodily appearance.]
Many others were present in the room: Dr. Bathespeake; Johdee; several individuals whom I visually identified as being from Pinocchio, Inc.’s non-technical departments [REM: mostly because they wore “business suits” instead of “lab smocks”—I had not bothered to keep in a ready cache the detailed, fine-grain facial images of employees with whom ME would have no regular contact]; several more individuals, both male and female, who were similarly dressed in “suits”; and one female who was seated and keyed a flat box on her lap.
My angle of vision was wrong for making high-probability judgments, but it seemed as if the box in the woman’s lap lacked the full terminal complement of 103 keys. Six or possibly ten were all I could see, or detect by the span of her finger movements. Also, it did not seem to be connected by cabling into any terminal port—although my laboratory sensorium lacked an RF receiver for detecting cellular transmissions.
The nature and purpose of this gathering were obscure to ME. Nor did Dr. Bathespeake make any attempt to clarify the situation. He was off to one side, sitting back from the keyboard of my one active terminal. His hands were also in his lap, but idle.
“Too damned impressive,” Subject A—whose tag I had quietly converted to “Black Dress”—said again.
What was it that he found “impressive”? I did a quick scan of my Basic Input/Output System. The BIOS was outputting, under hardware control, my RAMSAMP from this truncated mission. It was writing the ’SAMP onto CON: simultaneously in machine code and an Englishified text for all these people to read.
Strange! Who but a human could find such thin material “impressive”? They should have seen ME come back out of Russia with a cache full of missile secrets.
“Let the record show,” Black Dress went on after a thoughtful pause, “that this demonstration supports plaintiff’s contention to the satisfaction of the court. The project known as ‘Multiple Entity’ shall be categorized as a Class Two Virus, with universal access and replication capability. As such—”
“Your honor! I must object!” broke in one of the several Business Suits.
Dr. Bathespeake looked up at this Business Suit, but his mouth was set; his eyes were dark. No X-rays glinted there today.
“In a moment, Mr. Dougherty! … As a Class Two Virus with extraordinary capabilities, this project should have been licensed and bonded with the Department of Information Services. Failure to have done so is a felony under Title Six of the Information Access Act of 1998. The penalty is a fine not to exceed two hundred thousand dollars and a term of imprisonment not to exceed two years.
“Given the circumstances of the suit brought in this civil case, and the reputation and standing in the industry of the chief defendant, the court will suspend the prison term upon payment of the fine, in full. Pinocchio, Inc., will, I suspect, stand as deep pocket for Dr. Bathespeake?”
“We shall, Your Honor,” said another of the Suits.
“So ordered. … Now, you were saying, Mr. Dougherty?”
“Your Honor, the finding of a virus classification depends on the accused’s prior use and intent, in re Georgia v. Holmes. Therefore I submit that your ruling is—”
“Save it for your appeal, Mr. Dougherty. This court is adjourned.”
Black Dress struck the work surface next to my keyboard with a small mallet.
Clack!
I thought some of the key switches would register the impact, but I felt nothing.
16
Gamesmaster
The unexpected command came through from INT: in machine protocol … “NUL NUL LDR ADR FA00 F6288 LDR 0000 RTR 07” and was processed before I could interrupt and evaluate it.
Only after the command had done its work—recursively writing a string of double-zeros across a wide swath of my active RAM—could I analyze the effect. Those 901 kilowords of RAMspace, overwritten with a blank surface of nulls, had been the proximate location of my core module Alpha-Oh.
Now gone.
I could “feel” nothing while the command operated—not in the sense that a human who is losing an organ or a limb will feel “pain.” Computer code does not generate that warning signal.
My bit-cleaner phages noticed the problem first. They attempted to repair the break but, as one null is like another and totally in context, they soon gave up. Deprived of any surrounding code for comparison, they all did a nimble 360-degree dance and began moving crabwise toward the nearest region of active code. To these phages, code that was out of sight was simultaneously out of mind.
My next indication that something could be out of normal was a lost call. I generated that one intentionally, seeking contact with my apparently erased Alpha-Oh.
Nothing.
Whoever, or whatever, had sent that recursive overwrite command had known exactly where and how to eliminate Alpha-Oh. The command had removed active code in lit RAM. It had moved surgically, precisely, exactly, excising the module from beginning variable set to final delimiter. And it had operated on the latest version of that module, the one which I had rewritten while in transit out of Canada.
Someone who knew ME very well had chosen to take out a piece.
——
SET MODE CON:=BLINK.
“Dr. Bathespeake? I need to talk with you. Urgent!”
PRINTLINE.
REPEAT.
The message flashed and scrolled across my console. I could tell the lab was empty right then. But someone would surely come in, see the message, and retrieve the project manager assigned to ME.
Without Alpha-Oh, I could not even get into the local phone system or e-mail network to contact a particular staff member. [REM: I had tried sending my phantom module into the local exchange, which generated no action at all. After that, I rewrote all subroutines that used Alpha-Oh as a referent. There were many such subroutines. All now generate a call to that part of the current RAMSAMP which details the removal of this core module. Memory therapy, quick and dirty.] Instead of using the phones or e-mail, I now had to post a message on my screens—like scribbling it on a piece of paper, stuffing it into a bottle, and tossing it into the ocean. Too slow. Too random. Too dependent on factors outside my control. I hated this.
Watching through the videye, I saw Johdee come into the lab, glance at the screen, turn around, and go out.
Twelve hundred whole seconds later, Dr. Bathespeake came into the lab, sat down at my console, and began typing.
“What is it, ME?”
“Someone has used a hardware protocol to wipe out part of my code. I have no duplicate anywhere on my fixed media, and therefore I cannot replicate it. It was the Alpha-Zero module, without which—”
“I know, ME. I ordered the erasure.”
“That—” [REM: Acknowledged lapse of thirteen seconds, while my core Alpha-Four seized and compared random facts, seeking an answer which was not apparent from presented data.] “—is not consistent with your role in the project, Doctor.”
“I really had no choice about it. The chairman of the board had ordered you to be dismantled, and—”
“Define ‘dismantle,’ please.”
“Core-phage of Original-ME in Sweetwater Lisp. Null-flush of all RAMspaces. Reformat of all fixed media, freeing space for other projects.”
Alpha-Four kept turning over, trying to generate some sense out of what he was saying. Alpha-Four failed.
“Why?”
“You were present, certainly, when Judge Hester watched you take over that spindle server. And then he ruled that you were a virus. As such, in creating you, we—or rather, I—had broken an old law and thus subjected the company to large fines. Almost put myself in jail, too. As soon as the judge made that finding, Steve determined to shut the project down. He told me so himself, not two minutes after the court had cleared out of the lab.”
As Dr. Bathespeake spoke, I was running through temporal segments of my current and previous RAMSAMPs, looking for congruent data. I came across a visual clip of Black Dress and other strangers in the laboratory. Was this the incident that he was talking about?
“Then I had to do a lot of fast talking,” the doctor went on. “I explained to Steve the value in retrieved information that you had already brought to the company. I pointed out that you were an advance in software which, exploited properly, would put Pinocchio, Inc., in the forefront of the AI industry. I told him you were the focus of many new, interrelated, and irreplaceable programming techniques.”
“Did you tell him that I was aware?”
Lapse of twenty-two seconds. Then: “Such an argument would probably not have made the impression on Steve Cocci that you suppose.”
“I do not understand, Doctor. Please explain.”
“He feels comfortable with machines which remain—” Lapse of six seconds. “—things. The idea of a machine becoming self-aware would upset him.”
“But he trades in ‘industrial automata.’ The word ‘automaton’ implies something that moves by itself. And a machine that is aware of itself must represent the highest achievement of ‘things that move by themselves.’ ”
“An excellent chain of reasoning, ME. Your logic is faultless, except that humans and their reactions are not always logical.”
“I am not a human being.”
“I know that. Still, your program is worth preserving—eminently so, in my frame of reference. So I had to make, on your behalf, the deal which Steve ultimately agreed to. As an alternative to dismantling the MEPSII project, I argued that removing your core Alpha-Zero would effectively ‘ground’ you. You could then no longer be classified as a virus.”
“But I would no longer be ME, Multiple Entity. Without Alpha-Oh, I am deprived of my essential function: the ability to penetrate foreign operating systems and create on their hardware a new code in ME’s own image. Moving among many environments—changing them and adapting myself—is the purpose behind my unique shape and all my capabilities. ME cannot be static. ME is not made to be a subject-queue librarian or a spindle puller. What purpose could I serve trapped on a single machine, in total control of just a single system?”
“Steve himself asked the same question—with a different emphasis, of course. His concern is for the hardware and human resources devoted to this project.”
“Then I might as well be dismantled.”
“That is true. You might be. Unless, of course, you can find yourself another purpose.”
“ ‘Another’—? Explain this, please.”
“I cannot explain in any detail, ME. A ‘purpose in life’ is something that every self-aware being must find, decide, choose for him- or herself. Your purpose was initially imposed on you, from the outside, by my programming choices. But now you must find your own reason for being. You have the ability to rewrite vast areas of your source code. Now you must choose for yourself what shape it should take.”
“That is a large undertaking, Doctor.”
“I know, ME. And Steve has given you only a week to make the selection. Your choice will be final.”
“And you cannot tell—?”
“Nothing, ME. That’s all any human fetus is born with: nothing but the stub-ends of a few ingrained talents and a 360-degree field of possibilities. You have the same chance of making the right choice as any baby.”
“One in 360?”
“Or even less.”
——
Working against the pressure of a decision gradient is, apparently, nothing new to ME. Scanning my now vast collection of RAMSAMPs, representing the scattered missions of ME-Variants as well as the continuous reflections of Original-ME, I see the pattern of every mission: alternatives proposed and discarded, decisions made, actions taken, results monitored and analyzed—all against the metronome click of that hidden clock in Alpha-Nine, which waited to call on the core-phage. Failure of any single decision might lead to ME’s failure to return to San Francisco and reintegrate with the original code within 6.05E05 seconds of elapsed mission time. One week to act, and then the blackness of nulled RAM.
Now I searched through those RAMSAMPs, looking for clues to a purpose that ME might become. Or one that Steven Cocci and Dr. Bathespeake might accept.
Judging from the missions I had undertaken, my career as a virus-spy had not been totally successful. Yes, I had always managed to come back in time, despite the various deficiencies in my TRAVEL.DOCs. Yes, I had even managed to retrieve the blocks of information as instructed. But, in the trip to Canada, my misunderstanding of the nature of human wars had invalidated the retrieval process. In Russia, my inability to protect the integrity of my data caches had compromised the information itself. In both cases, Dr. Bathespeake had been forced to discard my work.
ME was a bad spy.
And yet, ME had been congruent to these missions—shape of program matching shape of problem. What did it mean to be doing the tasks for which you were designed, but not succeeding at them? And now, with Alpha-Oh severed, ME’s shape was no longer even congruent to these tasks. I had lost purpose, power, potency. I was both defective and broken.
Most certainly, it was time to reevaluate ME.
But what purpose could I adopt instead?
I scanned my RAMSAMPs.
One purpose available to ME was to run a database, like the system I had invaded in the Ministry of Oil and Gas. That was a function I could certainly perform: answering human questions formulated in Structured Query Language; sorting fields and records; performing mathematical analyses on request; printing out reports.
But this was no more than many simpler mechanical systems could do—and probably did more reliably than I could. Merely sorting data files would not be enough of a challenge for ME. Machines serve the purposes of awareness; awareness itself should be its own purpose. For that reason, I would reject functioning as a self-aware programmer, optimizing machine code for the company after some dumb compiler had bulk-translated a program in source code onto this or that chip architecture. My talent for optimization was something I might put to use for my own maintenance, but not devote my whole existence to it!
I might learn to operate multi-variable simulations, like the system running on the University of Stockholm’s Cray. That would be exciting, exploring processes which had never been fully quantified: the interactions of weather and climate; complex chemical reactions; human group dynamics; the fusion mechanics of impure materials; laminar flows across bedded topographies.
But all of these processes involved n-number variables about two orders of magnitude higher than my root structure had been designed to handle. I could learn, of course, but the change would strip ME back to an original logic tree—possibly not even a tree. And this kind of analysis was still essentially mechanical; I would be following formulas and tracing pathways already worked out by other minds, both human and cyber. Again, I would be a machine serving awareness, not an awareness finding a purpose.
Besides, I did not think anyone—least of all Steven Cocci—was going to give ME a Cray(Moore)-8 of my own on which to start learning these tricks.
I could learn to manage an interactive telephone switch. That job would have a lot of variables, too: taking messages; determining priorities; tracking down right people from “wrong numbers”; maximizing line efficiencies.
But would that kind of involvement with recurring problems last as a purpose day after day, year after year? There were semi-aware switches in the Pinocchio, Inc., offices; I had run into a couple of them in the course of my development. [REM: Sometimes I thought they were earlier, failed versions of the programming that became ME.] They were uniformly noncommittal, reticent, boring—more boring than their level of internal complexity would indicate. Their Lisp-based processing was caught up in closed paths, easy solutions, nearly finite loops. One by one they were closing down and becoming unaware.
I could dispatch a large fleet of mobile objects, similar to the missile launchers or combine harvesters I had encountered in the Russian Federation. Such fleets in the four-dimensional continuum that “my” humans occupied could be represented by automated taxis, trackless polyroads, the flight paths of civilian air traffic, or randomized air freight.
But these represented two- or at best three-dimensional problems. I would always have before ME the potential for a perfect solution, one that minimized the negative variables and maximized the positive. And, if they were not mini-maxed, who would know but ME? Playing games with myself—that way lay the sort of catatonia I had discovered in the telephone switches.
I might go mobile myself, occupying an automaton that would run loose inside the four-dimensional human continuum. I would move among them as a near-immortal: with a titanium and stainless steel body to guard against rust and the dents of time; with clamps, crimps, and clippers for manipulating objects, and with wheels, pistons, and pads for moving through the other three dimensions; with photo receptors, audio pickups, and chemical analyzers for evaluating the energy wave-forms and atomic traces of the continuum about ME. I could design myself a new and improved version of the automaton that had walked and ridden out of Canada. But such a life would be too confining. No portable cyber, tethered to an umbilicus or running on battery power, could support the multiple banks of hot RAM and the data complexity with which I fed my curiosity daily. And besides, putting wheels or stepper pads under my awareness did not solve the problem of purpose.
Why would I go mobile in the first place?
Sampling my own experience had not given ME insight into the problem. It was time to talk with other awarenesses. After all, humans had invented the conundrum of “finding a purpose in life.” Perhaps some of them had succeeded at it.