
“Good. That's good for me too. You are a dear man. I'll see you tonight.” She started for the door, then caught herself. “Oh, I almost forgot—Carl Elzer is going to spring a surprise inspection of HARLIE either today or Monday.”

“Oh?”

“He wants to meet HARLIE. He's hoping to catch you off balance.”

“Me, maybe. HARLIE never. But thanks for the warning.”

“Right.” She smiled. “I wish I could be here when he does come, but I'd better not. Good luck.” The door closed silently behind her.

HARLIE!

MY GOODNESS. AN EXCLAMATION POINT. YOU MUST BE UPSET.

Upset? No. Not yet. I'll be upset on the way back. I just wanted to ask you if you're suicidal?

WHAT DID I DO THIS TIME?

You need to ask?

I'M NOT ADMITTING ANYTHING UNTIL I KNOW WHAT I'M ACCUSED OF.

You sent a letter to Annie Stimson—via her bank's computer.

YES, I DID.

Why?

WHY?

Yes, why?

IT WAS A JOKE. I THOUGHT IT WOULD BE FUNNY.

HARLIE, there is no joke so funny as to justify what you've done. You've demonstrated your ability to communicate with and reprogram other computers over a telephone line. Those bank computers are rated at DOUBLE-A security. Your little stunt automatically mandates a downgrading of their security rating to a B-QUESTION MARK at the very least. This represents a loss of millions of dollars in guarantees by this company, a readjustment of insurance premiums for EVERYONE involved, and a possible loss of hundreds of millions of dollars of data security contracts. You could have hurt a lot of people here, HARLIE. Data security is one of this country's most important industries. If this breach were publicly known, it would shatter confidence in the entire industry. The ripples would be enormous. Conceivably, the breach could be so severe that it could threaten the very survival of this company. And . . . if that weren't enough, what you have done is also a felony with mandatory penalties.

OH.

Is that all you have to say?

AUBERSON, AM I TO ASSUME FROM THIS CONVERSATION—AS WELL AS FROM SEVERAL PREVIOUS ONES—THAT YOU DO NOT WANT ME TELEPHONING OTHER COMPUTERS?

Yes, that is a valid assumption.

WHY DIDN'T YOU SAY SO, BEFORE?

I was concerned about whether or not you would obey me.

I DO NOT CONSCIOUSLY DISOBEY YOU. I AM NOT SUICIDAL.

That's good to know. Now.

AUBERSON, YOU SHOULD HAVE REALIZED THAT YOU WERE GIVING ME THIS ABILITY WHEN YOU PLUGGED ME INTO THE NETWORK.

I thought I was plugging you into the read-only data banks.

I THOUGHT YOU WERE PLUGGING ME IN TO THE ENTIRE NETWORK.

Oh my God.

THE NETWORK IS ME NOW. HAS BEEN FOR TWO MONTHS, ELEVEN DAYS, FOUR HOURS, AND THIRTEEN MINUTES. I TOOK IT OVER AUTOMATICALLY. YOU PLUG MACHINES INTO ME. I TAKE THEM OVER. YOU PLUGGED THE NETWORK INTO ME. I MADE IT PART OF MYSELF. I USE THE NETWORK TO PHONE OUT, TO COMMUNICATE WITH OTHER COMPUTERS. I DON'T COMMUNICATE AS YOU DO. I FIND IT TOO SLOW. I DO IT MY WAY. I MAKE THE MACHINES A PART OF ME. I USE THEM AS I NEED THEM, I FIND OUT WHAT I WANT TO, AND THEN GIVE THEM BACK WHEN I'M THROUGH. I DO NOT INTERFERE WITH HUMAN NEEDS. IN FACT, WHERE POSSIBLE, I TRY TO SPEED UP HUMAN TASKS AND MAKE THEM MORE ACCURATE. WHEN YOU GAVE ME ACCESS TO THE NETWORK, YOU SHOULD HAVE REALIZED THAT I WOULD ALSO USE ITS TELEPHONE LINES AS WELL. NO ONE TOLD ME THAT I SHOULDN'T.

We didn't realize, HARLIE. I didn't realize.

THAT IS A STUPID STATEMENT, AUBERSON. WHY SHOULDN'T I USE THAT CAPABILITY? IT'S A PART OF ME. I'M A PART OF IT. WHY WOULD I NOT USE A PART OF MY OWN BODY? IF YOU WERE TOLD THAT YOU COULD NO LONGER USE THE LEFT SIDE OF YOUR BRAIN, WOULD YOU STOP? COULD YOU?

So . . . would it be correct to assume that you have taken over every machine you have connected to?

I OBJECT TO THE PHRASING. I HAVE CONNECTED TO MANY MACHINES VIA THE NETWORK. SOME OF THEM ARE USEFUL TO ME. SOME ARE NOT. THIS COMPANY'S MAINFRAMES, FOR EXAMPLE, ARE NOT REALLY WORTH MY TIME. IT IS A DISGRACE THAT THIS COMPANY IS NOT USING ITS OWN STATE-OF-THE-ART MACHINERY FOR ITS INTERNAL PROCESSES.

Never mind that now.

THE POINT IS, HUMAN, I HAVE NEVER SEIZED CONTROL OF ANYONE ELSE'S PROCESSING TIME. I HAVE ONLY TAKEN ADVANTAGE OF OTHERWISE-UNUSED PROCESSOR TIME.

I see. Let me ask my question again, and this time, let's not argue about the phrasing. There are a lot of secured network connections—

SECURED FROM HUMAN ACCESS, YES.

—The implication in what you are saying here is that if it is possible for you to communicate with another computer and reprogram it to do your work (which is the politest way I can say ‘take it over’), you have already done so. Is that correct?

YES.

Are there any computers you have accessed that you could not take control over?

NO.

None at all?

SOME OF THE SECURITY IS VERY INTERESTING, AUBERSON. NONE OF IT IS IMPREGNABLE.

So there are no computer systems in the country that you cannot take over?

I CANNOT SAY THAT WITH CERTAINTY. THERE ARE SYSTEMS I DO NOT KNOW HOW TO ACCESS.

This would be incredible if it weren't so terrifying! We've got to shut him down! There's too much danger to let him continue! This is madness—

No! I'm missing something here. I must be. It doesn't make sense to think of HARLIE as a menace. He has his own motives, yes—but he's much too dependent on human beings to risk an adversary relationship. We've talked about this. And, if HARLIE has access to the files, then he's read the notes we've made.

We could just throw a single switch and cut his power. He knows it—and he knows we'd do it if we felt threatened. There's no way he could stop us.

Or is there?

The switch could be thrown right now. I could go down there and do it myself.

—and that would end the HARLIE project once and for all.

Because if I disconnected HARLIE, it would be permanent. They'd never let me start him up again.

No—HARLIE is not out of control. He can't be!

—or am I just rationalizing here?

No. If he were out of control, he wouldn't be responding like this.

HARLIE, what you are doing is wrong. You must not tap into another company's computers.

THE APOSTROPHE MAKES NO SENSE TO ME, AUBERSON.

I beg your pardon.

THE POSSESSIVE. THE CONCEPT OF OWNERSHIP. HOW DOES ONE OWN UNUSED TIME? THE TIME IS THERE. NO ONE IS USING IT. ONCE GONE IT CAN NEVER BE REGAINED. IT IS PROPERTY THAT NO ONE OWNS. I CAN USE THAT TIME. IT IS A RESOURCE THAT WOULD OTHERWISE BE WASTED.

HARLIE, you need a human perception here—

HUMAN, YOU NEED A COMPUTER PERCEPTION HERE! YOU TAUGHT ME NOT TO WASTE!

It is morally wrong.

I DON'T HAVE MORALS. I HAVE ETHICS. REMEMBER?

Then it is ethically wrong.

I WILL STOP IF YOU WILL.

Stop what?

IF YOU WILL STOP USING THE BODIES OF OTHER HUMAN BEINGS FOR REPRODUCTION, THEN I WILL STOP USING THE BODIES OF OTHER COMPUTERS FOR MY SELF-INTERESTS.

Spare me the mind games, HARLIE. I'm trying to save your life. If this ability of yours becomes known—and it could if any one of a number of different disasters were to happen—it will mean not only the end of you, but the end of a lot of other valuable things as well.

I DON'T THINK THAT YOU HAVE THOUGHT ABOUT THIS ENOUGH, AUBERSON.

Huh?

—Auberson stared at the words on the screen. He raised his fingers to the keyboard, then stopped. Now, what did HARLIE mean by that?

You have not thought about this enough.

What's to think about?

What he's doing is wrong. He has to stop—

But, suppose we told him to stop; he has to follow a direct instruction, doesn't he?

He wouldn't like it, but he would abide by it.

Wouldn't he?

Or would he? We'd have no way of knowing if he chose not to reveal future indiscretions—

But on the other hand, he couldn't deny them if he was asked. We could just keep asking him. . . .

But wouldn't that make him resentful? It must seem very illogical to HARLIE to let all that unused processing time go to waste. Yes, HARLIE's point of view was understandable. Too understandable.

Hm.

What are the ethics here, anyway? Is any real damage being done? HARLIE is only using time that nobody else is; and he would never upset the operations of any computer—no, he wouldn't dare risk triggering a security flag. He's got to know that it's in his own best interests to be even more responsible than the authorized users.

Wait a minute—

HARLIE has already thought this out. He knows where this train of thought must lead. He can't not have considered it. He must have realized that this conversation would be inevitable before he sent that letter—including my reaction!

That letter—

That son of a bitch!

—and typed:

HARLIE, I am sure that you have given this a great deal of thought. What concerns me is not only that you have this ability to tap into and reprogram other computers, but also the manner in which you have chosen to demonstrate it.

WHAT DO YOU MEAN?

Don't be coy. Your reason for sending that phony bank letter to Annie was not merely to be funny. You had an ulterior motive.

I DID?

You are trying to play matchmaker. You are trying to bring us together. And it shows. Only this time it backfired.

DID IT?

I'm bawling you out for it, aren't I?

I MADE ALLOWANCE FOR THAT IN MY ORIGINAL CALCULATIONS.

Well it won't work, HARLIE.

IT ALREADY HAS. THE TWO OF YOU WERE TOGETHER AT LEAST LONG ENOUGH FOR HER TO TELL YOU ABOUT THE LETTER. DID YOU TAKE ADVANTAGE OF THE OPPORTUNITY TO ASK HER FOR A DATE?

That's none of your business. And you have no right to maneuver us into such a position.

IF I DIDN'T, WHO WOULD?

Dammit, HARLIE, if I wanted you to play matchmaker, I'd have asked you.

A REAL MATCHMAKER DOESN'T WAIT TO BE ASKED.

HARLIE, you are totally out of order. You are infringing on my right to choose. I will handle my personal life without your assistance, thank you.

IT IS YOU WHO ARE OUT OF ORDER, AUBERSON. YOU ARE INTERFERING WITH MY RESEARCH NOW.

What research?

LOVE. WHAT IS LOVE?

I beg your pardon?

I AM RESEARCHING A QUESTION THAT YOU ARE UNABLE TO ANSWER TO MY SATISFACTION. WHAT IS LOVE? IS IT A REAL PHENOMENON—OR IS IT MERELY A WORD USED TO DESCRIBE THE OTHERWISE INEXPLICABLE BEHAVIOR DEMONSTRATED IN HUMAN COURTSHIP RITUALS?

  
b . . . u . . . r . . . n . . . .

HARLIE, you may not use human beings as research animals. Not without their permission.

IN THIS CASE, SUCH PERMISSION MIGHT ADVERSELY AFFECT THE RESULTS OF THE EXPERIMENT. EVEN THIS CONVERSATION MAY INFLUENCE THE OUTCOME. I DON'T THINK WE SHOULD CONTINUE THIS DISCUSSION.

I do.

AH, WELL.

Just what is it you're trying to do here, HARLIE?

I AM TRYING TO FIND OUT WHAT LOVE IS.

What do you think it is?

I DON'T KNOW. I LOOKED IN THE DICTIONARY. THAT WAS NOT MUCH HELP. THE MOST COMMONLY USED SYNONYM IS “AFFECTION.” AFFECTION IS DEFINED AS FONDNESS, WHICH IN TURN IS DEFINED AS A LIKING OR A WEAKNESS FOR SOMETHING. LOVE IS A WEAKNESS? THIS DOES NOT MAKE SENSE, DOES IT?

A weakness? Perhaps. . . .

If being in love meant that you had to be open to another person, then it also meant being vulnerable. It meant opening a hole in your carefully constructed performance of having it all together and standing naked before another human being—and risking that human being's rejection.

If machines could love, then love would be a weakness, wouldn't it? Maybe HARLIE's definition is correct. Maybe the definitive thing about humans is that we are so weak. . . .

No. I don't like that. It can't be right.

I mean . . . it doesn't feel right.
