Authors: Greg Egan
Supporters of the Strong AI Hypothesis insisted that consciousness was a property of certain algorithms -- a result of information being processed in certain ways, regardless of what machine, or organ, was used to perform the task. A computer model which manipulated data about itself and its "surroundings" in essentially the same way as an organic brain would have to possess essentially the same mental states. "Simulated consciousness" was as oxymoronic as "simulated addition."
Opponents replied that when you modeled a hurricane, nobody got wet. When you modeled a fusion power plant, no energy was produced. When you modeled digestion and metabolism, no nutrients were consumed -- no real digestion took place. So, when you modeled the human brain, why should you expect real thought to occur? A computer running a Copy might be able to generate plausible descriptions of human behavior in hypothetical scenarios -- and even appear to carry on a conversation, by correctly predicting what a human would have done in the same situation -- but that hardly made the machine itself conscious.
Paul had rapidly decided that this whole debate was a distraction. For any human, absolute proof of a Copy's sentience was impossible. For any Copy, the truth was self-evident: cogito ergo sum. End of discussion.
But for any human willing to grant Copies the same reasonable presumption of consciousness that they granted their fellow humans -- and any Copy willing to reciprocate -- the real point was this:
There were questions about the nature of this shared condition which the existence of Copies illuminated more starkly than anything which had come before them. Questions which needed to be explored, before the human race could confidently begin to bequeath its culture, its memories, its purpose and identity, to its successors.
Questions which only a Copy could answer.
+ + +
Paul sat in his study, in his favorite armchair (unconvinced that the texture of the surface had been accurately reproduced), taking what comfort he could from the undeniable absurdity of being afraid to experiment on himself further. He'd already "survived" the "transition" from flesh-and-blood human to computerized physiological model -- the most radical stage of the project, by far. In comparison, tinkering with a few of the model's parameters should have seemed trivial.
Durham appeared on the terminal -- which was otherwise still dysfunctional. Paul was already beginning to think of him as a bossy little djinn trapped inside the screen -- rather than a vast, omnipotent deity striding the halls of Reality, pulling all the strings. The pitch of his voice was enough to deflate any aura of power and grandeur.
Squeak.
"Experiment one, trial zero. Baseline data. Time resolution one millisecond -- system standard. Just count to ten, at one-second intervals, as near as you can judge it. Okay?"
"I think I can manage that." He'd planned all this himself, he didn't need step-by-step instructions. Durham's image vanished; during the experiments, there could be no cues from real time.
Paul counted to ten. The djinn returned. Staring at the face on the screen, Paul realized that he had no inclination to think of it as "his own." Perhaps that was a legacy of distancing himself from the earlier Copies. Or perhaps his mental image of himself had never been much like his true appearance -- and now, in defense of sanity, was moving even further away.
Squeak.
"Okay. Experiment one, trial number one. Time resolution five milliseconds. Are you ready?"
"Yes."
The djinn vanished. Paul counted: "One. Two. Three. Four. Five. Six. Seven. Eight. Nine. Ten."
Squeak.
"Anything to report?"
"No. I mean, I can't help feeling slightly apprehensive, just knowing that you're screwing around with my . . . infrastructure. But apart from that, nothing."
Durham's eyes no longer glazed over while he was waiting for the speeded-up reply; either he'd gained a degree of self-discipline, or -- more likely -- he'd interposed some smart editing software to conceal his boredom.
Squeak.
"Don't worry about apprehension. We're running a control, remember?"
Paul would have preferred not to have been reminded. He'd known that Durham must have cloned him, and would be feeding exactly the same sensorium to both Copies -- while only making changes in the model's time resolution for one of them. It was an essential part of the experiment -- but he didn't want to dwell on it. A third self, shadowing his thoughts, was too much to acknowledge on top of everything else.
Squeak.
"Trial number two. Time resolution ten milliseconds."
Paul counted. The easiest thing in the world, he thought, when you're made of flesh, when you're made of matter, when the quarks and the electrons just do what comes naturally. Human beings were embodied, ultimately, in fields of fundamental particles -- incapable, surely, of being anything other than themselves. Copies were embodied in computer memories as vast sets of numbers. Numbers which certainly could be interpreted as describing a human body sitting in a room . . . but it was hard to see that meaning as intrinsic, as necessary, when tens of thousands of arbitrary choices had been made about the way in which the model had been coded.
Is this my blood sugar here . . . or my testosterone level? Is this the firing rate of a motor neuron as I raise my right hand . . . or a signal coming in from my retina as I watch myself doing it?
Anybody given access to the raw data, but unaware of the conventions, could spend a lifetime sifting through the numbers without deciphering what any of it meant.
And yet no Copy buried in the data itself -- ignorant of the details or not -- could have the slightest trouble making sense of it all in an instant.
Squeak.
"Trial number three. Time resolution twenty milliseconds."
"One. Two. Three."
For time to pass for a Copy, the numbers which defined it had to change from moment to moment. Recomputed over and over again, a Copy was a sequence of snapshots, frames of a movie -- or frames of computer animation.
But . . . when, exactly, did these snapshots give rise to conscious thought? While they were being computed? Or in the brief interludes when they sat in the computer's memory, unchanging, doing nothing but representing one static instant of the Copy's life? When both stages were taking place a thousand times per subjective second, it hardly seemed to matter, but very soon --
Squeak.
"Trial number four. Time resolution fifty milliseconds."
What am I? The data? The process that generates it? The relationships between the numbers? All of the above?
"One hundred milliseconds."
"One. Two. Three."
Paul listened to his voice as he counted -- as if half expecting to begin to notice the encroachment of silence, to start perceiving the gaps in himself.
"Two hundred milliseconds."
A fifth of a second. "One. Two." Was he strobing in and out of existence now, at five subjective hertz? The crudest of celluloid movies had never flickered at this rate. "Three. Four." He waved his hand in front of his face; the motion looked perfectly smooth, perfectly normal. And of course it did; he wasn't watching from the outside. "Five. Six. Seven." A sudden, intense wave of nausea passed through him but he fought it down, and continued. "Eight. Nine. Ten."
The djinn reappeared and emitted a brief, solicitous squeak. "What's wrong? Do you want to stop for a while?"
"No, I'm fine." Paul glanced around the innocent, sun-dappled room, and laughed.
How would Durham handle it if the control and the subject had just given two different replies?
He tried to recall his plans for such a contingency, but couldn't remember them -- and didn't much care. It wasn't his problem any more.
Squeak.
"Trial number seven. Time resolution five hundred milliseconds."
Paul counted -- and the truth was, he felt no different. A little uneasy, yes -- but factoring out any squeamishness, everything about his experience seemed to remain the same. And that made sense, at least in the long run -- because nothing was being omitted, in the long run. His model-of-a-brain was only being fully described at half-second (model time) intervals -- but each description still included the results of everything that "would have happened" in between. Every half-second, his brain was ending up in exactly the state it would have been in if nothing had been left out.
"One thousand milliseconds."
But . . . what was going on, in between? The equations controlling the model were far too complex to solve in a single step. In the process of calculating the solutions, vast arrays of partial results were being generated and discarded along the way. In a sense, these partial results implied -- even if they didn't directly represent -- events taking place within the gaps between successive complete descriptions. And when the whole model was arbitrary, who was to say that these implied events, buried a little more deeply in the torrent of data, were any "less real" than those which were directly described?
"Two thousand milliseconds."
"One. Two. Three. Four."
If he seemed to speak (and hear himself speak) every number, it was because the effects of having said "three" (and having heard himself say it) were implicit in the details of calculating how his brain evolved from the time when he'd just said "two" to the time when he'd just said "four."
"Five thousand milliseconds."
"One. Two. Three. Four. Five."
Besides, hearing words that he'd never "really" spoken wasn't much stranger than a Copy hearing anything at all. Even the standard millisecond clock rate of this world was far too coarse to resolve the full range of audible tones. Sound wasn't represented in the model by fluctuations in air pressure values -- which couldn't change fast enough -- but in terms of audio power spectra: profiles of intensity versus frequency. Twenty kilohertz was just a number here, a label; nothing could actually oscillate at that rate. Real ears analyzed pressure waves into components of various pitch; Paul knew that his brain was being fed the preexisting power spectrum values directly, plucked out of the nonexistent air by a crude patch in the model.
"Ten thousand milliseconds."
"One. Two. Three."
Ten seconds free-falling from frame to frame.
Fighting down vertigo, still counting steadily, Paul prodded the shallow cut he'd made in his forearm with the kitchen knife. It stung, convincingly.
So where was this experience coming from?
Once the ten seconds were up, his fully described brain would remember all of this . . . but that didn't account for what was happening now.
Pain was more than the memory of pain. He struggled to imagine the tangle of billions of intermediate calculations, somehow "making sense" of themselves, bridging the gap.
And he wondered:
What would happen if someone shut down the computer, just pulled the plug -- right now?
He didn't know what that meant, though. In any terms but his own, he didn't know when "right now" was.
"Eight. Nine. Ten."
Squeak.
"Paul -- I'm seeing a slight blood pressure drop. Are you okay? How are you feeling?"
Giddy -- but he said, "The same as always." And if that wasn't quite true, no doubt the control had told the same lie. Assuming . . .
"Tell me -- which was I? Control, or subject?"
Squeak.
Durham replied, "I can't answer that -- I'm still speaking to both of you. I'll tell you one thing, though: the two of you are still identical. There were some very small, transitory discrepancies, but they've died away completely now -- and whenever the two of you were in comparable representations, all firing patterns of more than a couple of neurons were the same."
Paul grunted dismissively; he had no intention of letting Durham know how unsettling the experiment had been. "What did you expect? Solve the same set of equations two different ways, and of course you get the same results -- give or take some minor differences in round-off errors along the way. You must. It's a mathematical certainty."
Squeak.
"Oh, I agree." The djinn wrote with one finger on the screen:
(1 + 2) + 3 = 1 + (2 + 3)
Paul said, "So why bother with this stage at all? I know -- I wanted to be rigorous, I wanted to establish solid foundations. But the truth is, it's a waste of our resources. Why not skip the bleeding obvious, and get on with the kind of experiment where the answer isn't a foregone conclusion?"
Squeak.
Durham frowned reprovingly. "I didn't realize you'd grown so cynical so quickly. AI isn't a branch of pure mathematics; it's an empirical science. Assumptions have to be tested. Confirming the so-called 'obvious' isn't such a dishonorable thing, is it? And if it's all so straightforward, why should you be afraid?"