
Staples mused for a minute. “Okay. I want the upgrade, no question, but I’m going to ask you to wait. I’m expecting to hear from a friend of mine on Mars about an inquiry I made. I was actually hoping to have received it by now. I’d like to wait until I get that message before we go coms dark. Once I do, I’ll give you the go-ahead. Sound good?”

“Sounds good,” he replied, and headed for the ladder.

 

Day five.

              “Your move,” Yegor said in Russian, a smile on his lips.

              Piotr looked at him in a way that clearly communicated that he was aware of the fact. His dark eyes surveyed the board for another minute in silence, and Yegor took another sip of his bourbon. Finally the bald cook moved the magnetic rook across the board and positioned it to threaten his opponent’s bishop.

              “You can’t rush greatness, my friend,” Piotr said, much more comfortable and fluent in his native tongue. “How goes your grand project?” He reached for his own metal cup of the Kentucky bourbon that he had smuggled onboard the ship back on Earth.

              “Slowly,” Yegor replied as he surveyed the chessboard’s new and more threatening configuration. Piotr’s move was not one he had failed to anticipate, but he had not expected such boldness from his fellow countryman. “There are a dozen connection points that need to be tested, each with its own passkeys, and once that’s done-”

              “Mm hm,” Piotr interrupted, ably communicating his interest in chess over the complexities of integrating communications suites built by different manufacturers.

Yegor looked at him. “That’s really why I invite you here, Piotr. You’re such a great listener.”

“If only you talked as well as I listen,” he rejoined, and Yegor smiled and bent back to the game. Piotr looked around the coms officer’s quarters, spare as they were, his eyes eventually settling on the small bookshelf his friend maintained. The books were held in place by a piece of twine across their spines. There were the classics he expected from their shared country of birth, of course, but he had found that Yegor had a curious love for American literature as well. Mark Twain was hiding between Anton Chekhov and Leo Tolstoy, and William Faulkner had cozied right up to Fyodor Dostoyevsky. As far as he was concerned, the best thing to come out of the American South was currently warming his belly. He took another sip.

Yegor slid a magnetic pawn into place to protect his bishop. “Then you talk and let me listen.” His face grew more serious. “How is your sister? The children?”

Piotr shook his head. “The same. Poor. I’d like to find that bastard.”

“Then you’d be in jail and she really would have no one to help her.” His face was sympathetic. “How’s the motherland?”

Piotr snorted derisively. “You watch the news.” His bishop slid out from the back row to threaten the offending pawn. Pieces were quickly piling up on this one exchange, and the scent of blood was in the air.

“I try not to, actually. Home isn’t home anymore.”

Piotr nodded. He had known of his friend’s familial trouble for some time. “Then why ask me?”

“So…” He reached out to touch a piece, then withdrew his hand as he reconsidered. “I can show you what a good listener I am.” He smiled absently as he focused on the board.

“Well, the lines have been stable around Moscow for some time. The hardliners won’t give up the capital, and the rest of the country doesn’t want a bloody battle. The whole reason they began fighting against Moscow was their totalitarian rule, their intolerant views, and their bloody tactics. It doesn’t make much sense to kill them all for their beliefs, even if those beliefs are terrible. Anti-Semitic, anti-gay, anti-freedom… they’re like some relic of the 20th century, and they just can’t accept that the world has moved on.”

Yegor was silent a moment longer, then brought a knight to bear, further increasing the potential casualties should a battle ensue. “Seems like there’s always someone digging in their heels and trying to hold back progress. I’m just embarrassed that it’s us... well, our country, anyway.”

Piotr shrugged. “I say fight.” At first, Yegor didn’t know whether he was referring to the situation in Russia or the chessboard in front of them, but then he continued. “Some people you can reason with. Some people you can’t. It’s like the Nazis. You can’t talk them out of their idiocy.”

Yegor sighed. “Maybe you’re right. Too bad.” He took another sip, reached for a piece, and girded himself for battle.

 

Day seven.

              As she made her way to the mess hall, Staples heard two voices. John Park and Don Templeton, she thought, and wondered what they were doing up at this hour. It was nearly midnight ship time, and after an hour reading Kyd’s The Spanish Tragedy, she had just given up on sleep for the time being and decided to rummage a raspberry yogurt from the galley. The two men seemed to be arguing by the tones of their voices, but they stopped abruptly as she padded into the room in her slippers and robe. Park was wearing a pair of sweatpants and a ratty tee shirt, and Templeton was dressed in his usual flight jacket and slacks. Staples wondered idly whether he washed them frequently or simply had seven sets of the same outfit. They were both looking at her. Beyond them, at the other end of the table, Piotr sat quietly regarding them like a member of a crowd at a tennis match.

              She gave a lopsided grin and continued walking over to one of the refrigeration units. “Don’t stop on my account.” She fished around inside, unclasped a yogurt from its plastic holder, and lifted a metal spoon out of the magnetic silverware tray. The men did not continue. Staples sat down at the table near Don, placed her spoon and plastic cup on the surface, and rubbed her eyes tiredly. When she had finished, she looked back and forth at the two of them. “No, really. What’s the subject of discussion?”

              Templeton regarded a spot on the table pointedly, but John said, “The AI research bill. They’re voting on it – again – in a few months. We’ve been having a little debate.”

              “I see.” She opened her midnight snack, spooned out a bit, and leaned forward eagerly as if about to watch two champion poker players begin a game.

              Templeton still seemed reluctant to discuss the matter in front of his captain, but Park had no such reservations. “My first mate here believes that we should stop progressing as a species.”

              The gross oversimplification did its job, and Templeton rose to the bait. “I didn’t say that at all, and you know it. I just said that in some areas, there should be limits. Look, take weapons. Let’s say some scientist says he can build a weapon that will blow up a planet. Do you fund that? What if he says that he’s going to spread the research all over netlink? Do you try and stop him? ‘Cause you know some kook out there is gonna use it.”

              “Fair point,” was Park’s tame rejoinder, “for weapons. But Artificial Intelligence isn’t a weapon.”

              “It could be.”

              “So can this ship. Do you know how much damage this ship could cause to a city? To a planet? If we ran at one G of thrust for six weeks, we’d be going twelve percent the speed of light. Can you imagine if we plowed into New York City going that fast?”

              “There’s safety protocols to stop that,” Templeton said weakly.

              “Anything that can be invented can be circumvented. There’s a counter for everything.”

              “Maybe. But ships ain’t what we were talking about either. You’re talkin’ about creating something just as smart as a human. We get smarter as we get older, learn more and more, ‘cept there’s a limit on that. We grow old and die. Machines don’t die. They just get older and smarter and pretty soon some AI is at three hundred IQ and what if it decides it don’t like humans so much?”

              “Oh, the old ‘what if’ argument. You say ‘what if it’s evil?’ I say ‘what if it’s good?’ Imagine the problems it could solve, the technology it could create, the diseases it could cure.”

              “Why should it give a crap about us? It’s not like it’ll be human.”

              Park gestured at the table with his index finger as he spoke. “But if we teach it to be good.”

              Templeton guffawed. “Teach it?” he asked incredulously. “It’s a damn machine. Do you know how naïve you sound? You’re gonna teach it right from wrong? Who says it’ll even acknowledge those concepts? I’m not even sure I acknowledge those concepts. And if it’s really AI, then won’t it get to choose what to do with its lessons like any other intelligent person?”

              Suddenly John seemed to be on the defensive. “Well, it would have to be monitored, safeguarded, have a kill switch, that sort of thing.”

              Templeton leaned forward, savoring the moment. “‘Anything that can be invented can be circumvented.’”

              John rolled his eyes. “Okay, yeah, but this would be different.”

              “How so?”

              “Well, the scientists who are programming the AI would have total control; they could monitor programming, brain functions if you will, and if things stray into the danger zone, they flip the switch.”

              “I’m afraid,” Staples interjected, putting her spoon down, “that this opens up an entirely new can of worms. You’re talking about terminating a sentient intelligence because you don’t like what it’s thinking. Once you’ve created life, do you have the right to destroy it?”

              John and Templeton looked at her. “Well, it’s not life,” Templeton answered. “It’s just programmed to act like it.”

              “Can you prove the difference, Don? A Turing Compliant computer is indistinguishable from a human being. That’s what Turing Compliant means. The problem isn’t how do we create artificially intelligent life. The question is: what do we do with it once we have?” She placed both of her hands on the table, one over the other, and looked back and forth between the two men as she spoke.

              “What you use AI for?” Piotr’s heavily accented voice came from the end of the table, startling them. They had nearly forgotten he was there.

              John, still championing his cause, replied, “Whatever else you use a computer for, but better. Imagine a computer able to make decisions, weigh moral consequences, on a battlefield. It could stop the loss of human life.”

              “Great. Then there’d be no reason not to go to war,” Staples muttered.

              “Okay, bad example. Imagine a self-aware machine performing surgery, or exploring outer space, or babysitting your children, or… I don’t know, running a space station. This whole journey we’re taking could be avoided; instead of having to fly highly trained specialists to the far side of Saturn, which is, I’ve heard, not a really fun place to live, we could just transmit a computer program or launch a computer out there, and… done!”

              “Nice job, John,” Templeton said smugly. “You just talked yourself out of a job. See you at the unemployment office.”

              “Look, I’m not talking about replacing people, just about the possibility of us growing as a species with the help of a new tool, the way we always have,” John replied. It was becoming clear to Staples that his theoretical points outstripped his ideas about practical application.

              “I have question.” Piotr’s deep voice came again from the end of the table. “What if AI doesn’t want to be explorer, or babysitter, or tool?”

              Staples nodded in agreement. “In many ways, that’s the crux of the matter. Most people are concentrating on all the bad things that could happen if full AI research gets the go-ahead and it goes bad. The truth is, some kids go bad too. We don’t stop having children because they might turn out to be serial killers or thieves. The real problem here is rights. If you’re going to create sentient life…” Templeton looked at her in objection. “… fine, just call it sentience. If you’re going to create sentience, truly self-aware machines, then you have to grant them the same rights you grant every other person. Life, liberty, all of that. If you don’t, it’s slavery, a very ugly thing that we’ve worked very hard to stamp out as a species.”

              “But if you program them to do a certain job…” Park ventured.

              She looked at him. “Does Gwen always do what you ask her to?” She didn’t bother to wait for an answer before continuing. “Does every child raised in a given religion choose to follow that religion in their adult years? The essence of sentience is choice. We are currently the only sentient species that we know of. Plenty of animals will die to defend their homes, or their young, or their mates, but we’re the only ones that will die to defend an ideal. The rational part of our brain can suppress the animalistic part, our survival instinct… our programming, if you will. We can choose to ignore millions of years of evolutionary programming. You have to grant that if we create a sentient machine, it will have that choice as well. If it doesn’t, then we haven’t created sentience. When you’re talking about inventing AI, you’re talking about inventing another species, and as someone pointed out to me recently, we’re not very good at sharing the spotlight.”
