Author: Robert Berke
Bayron said, "Oh, I know it can be done, but it adds an infinite number of new variables. What if you can't distinguish your memories from the information available on the internet? What if you get hacked?"
"You can't keep the genie in the bottle, Bayron. Just firewall me as best as you can. I want this done ASAP."
"I can't quantify the risks, Smith."
"Look, Bayron," Smith's measured, artificial voice articulated in its unchanging, even tone, "we've been dealing in unquantifiable risks from the first time we shook hands on this endeavor. This is not a significant increase in risk and you know it. You just want me all for yourself."
"You're my monster, Mr. Smith. I know you understand my concerns." Bayron knew that Smith would get what he wanted. He didn't bother to fight. "I still have the team that designed and implemented the interfaces and all they're doing is monitoring now. I can put them on this right away and probably get you on the internet within a few days. Just, don't go Frankenstein on me."
"Don't frighten me with metaphors. I'm no monster and you're no Dr. Frankenstein."
"Your genie metaphor is equally frightening, Smith. Remember what happened when the genie got out of the bottle."
"I promise to be good."
"At least be careful," Bayron said turning to go back to his lab, "you're still only human, after all."
Just before Bayron left the room, Smith stopped him. "By the way, Doc, any idea why I might have dreamt of Russia last night?"
Bayron stopped in his tracks and turned around. "What?" he said, then repeated, "What do you mean?" Bayron's surprise was undisguised as he walked back to the monitor.
Smith told Bayron about his dream and Bayron's face flushed. "We had to use some parts of the Russian model, but nothing that should have affected your memory. I'm going to chalk this one up to the vagaries of the subconscious. Let me know if you have any waking memories that you can't identify as your own. That would be cause for some concern."
"It's a little more complex than a European Quail, isn't it?" Smith asked rhetorically.
"We're going to know more about how the human brain works than anyone ever has, Mr. Smith, as if we didn't already."
"I hope Bob Hanover got those patents done. You should follow up on that when you get a chance."
"I'll do that," Bayron replied. "In the meantime, let me know if Flat Stanley makes any new appearances."
Before leaving the room, Bayron made some notes in his black spiral notebook.
Bayron's lab retained its office-like atmosphere. The only thing that had changed about it in the preceding months was the addition of nearly one hundred monitor screens grouped in sets of various sizes hanging on the walls. The monitors were all measuring or tracking something different. Some displayed line charts, some displayed numbers. Others represented the flow of digital "blood" through the digital arteries of the digitized brain that was now performing all of Smith's lower and mid-level brain functions.
The only person who knew the meaning of all the information on all the monitors was Dr. Bayron. He had grouped the monitors in such a way that, with one pirouette, he could see everything that was going on with Smith's two operating brains and his one useless body.
"Sharky," Bayron said placing a hand on the shoulder of the young engineer in the white labcoat, "come with me."
Sarkis Ohangangian, known around the lab as "Sharky", was Bayron's protégé. His father and pregnant mother fled the destruction caused by the devastating 1988 earthquake in Armenia and came to the United States just months before he was born, making him the first member of his family born outside of Armenia. In Armenia, his father had been an electrical engineer. But in the United States, unable to speak English, his experience, skills, and talent were worthless. His education and intelligence, however, were invaluable.
Sharky's father landed his first job in the United States as a machinist in a factory working for cash under the table. Then he found work as an auto-mechanic and ultimately opened his own repair shop. Eventually he opened several shops. By the time Sharky had graduated high school, his father owned a chain of auto supply and mechanics shops in three states. Sharky learned mechanics when he was a little boy. He learned how to weld, how to build engines, how mechanical things worked. This was his first language. As his father became more successful, Sharky gained access to the best schools, the best universities, and the best opportunities.
Sharky's blazing intelligence, natural inquisitiveness, and unstoppable drive made him exactly the kind of person most deserving of the opportunities his father's success could provide.
Sharky followed Dr. Bayron to his office in eager anticipation of whatever challenge Bayron was going to throw at him.
As they sat in Bayron's glass-enclosed outer office, Sharky got the sense that Bayron was tired. He looked old. "He wants internet access." Bayron spoke as casually as a waiter telling a cook that the party at table five wanted soup, but Sharky immediately grasped the significance and gravity of the request.
He summed up his concern succinctly: "Then it won't be a closed system."
Bayron answered the non-question with a non-answer. "We could lose control of the operating environment. What do you think?"
"Firewall of some sort. We can't let him get hacked, but there's no perfect firewall. We'll have to back him up, obviously. Maybe let two artificial brains work in tandem: one that's wired to the 'net and one which would require a disconnect from the net before permitting updates to be made. Maybe even a third brain to decide if any of the information brought in is virulent or malicious." Sharky was brainstorming.
"An id, superego, and ego."
"Funny in a way, isn't it?" Sharky said. "I mean, if the artificial brain would hold all of the elements of his personality, then the id, the superego and the ego would already be a part of the system and there would be no need for three separate brains, would there? Then it would just be an issue of making the mechanical connection to the web, which I could actually do today."
"You put too much faith in both science and people, Sharky, neither will ever fail to disappoint you." Bayron sighed. "Think negative for a minute."
Sharky raised one eyebrow to call attention to the irony in Bayron's last remark, and Bayron realized what Sharky apparently already knew: that the entire project was predicated on faith in science and a love for people.
Bayron felt comfortable sharing his concerns with Sharky. He was, after all, a very human engineer.
Sharky continued brainstorming, "Here's a negative for you. Even if we didn't have to worry about corruption to the Smith model, maybe we have to worry about mingling a sentient and intelligent data set with the repositories of all human information. If Smith were malicious, he would be an absolutely unstoppable virus capable of infinite adaptations. And I mean unstoppable. Theoretically, he could replicate himself an infinite number of times and filter every single piece of information that exists in electronic form, which is everything. Smith could put whatever spin he wanted on the entirety of human knowledge. If he woke up one day and decided that the moon was made of green cheese, he could insert that fact into every repository of human information in an instant. He'd be the ultimate arbiter of truth."
Bayron looked Sharky square in the eyes. "Not just of truth. He'd be the ultimate arbiter of reality itself. Once he controlled the sources of truth, there would be no difference between truth and lies. The facts would cease to exist and they would be replaced with whatever Smith decided the facts should be. He would be the mind of the entire world. That's a fellow you would not want to piss off," Bayron quipped. "He'd literally be able to make sure that no differing opinions could ever be aired."
"Did you ever try to get toothpaste back into a tube?" Sharky asked. "There'd just be no fixing even a single instance of that kind of thing. Remember that kid who still gets all those get well cards? You can't just undo something once it proliferates over the Internet.
"Okay, let's think about it tonight and make some decisions tomorrow." Bayron concluded.
Sharky lived with his mother. Even though he was paid very well by SmithCorp-- very, very well-- he could never leave his mother alone. Since his father had died, she almost never left the house. The bedroom he had slept in since he was a little boy was still his best thinking place.
"Sako," his mother called him by his childhood nickname as he came in the front door, "Sako, come eat. I made tacos." Her thick accent made the word 'tacos' sound like a traditional Armenian dish even though she had never seen or heard of a taco until Sharky was a teenager. He was embarrassed by the ethnic food his mother prepared. He wanted American food. Hamburgers, macaroni and cheese, crunchy tacos and the like. His mother, always accommodating of his wants, not only learned to make these "American" dishes, but actually enjoyed them herself.
"I have a lot of work to do, ma, can you fix me a plate to take up with me?"
"Ach, my little working man. Just like your papa. I make you nice plate and bring it up. You make me proud."
Sharky climbed the stairs to his bedroom feeling more and more exhausted with each step. He lay down on his bed and looked at the ceiling, his mind wandering. He knew exactly how to get Smith on the Internet. He had worked out the details on his way home. No challenge in that. But, he knew that was not what Bayron had wanted him to think about.
Was it possible that Bayron hadn't considered this before? Was it possible that this issue had only just crossed his mind? Sharky himself had thought about some of the wider implications of the project, but he had always felt comfortable in the knowledge that the ethics of the situation were Bayron's problem.
Now Bayron was looking to him for ethical guidance. He didn't like it.
Sharky opened the drawer on his nightstand and took out a small pipe. He patted down the marijuana in the bowl and decided there was enough there for one good hit. As he lit the bowl with a disposable lighter, a small flame shot up, reminding him that he was reaching for heaven.
No sooner had he exhaled than the door to his room opened. His mother came in with a plate of tacos and a glass of Coca-Cola. "This is what you call working?"
"Just needed to relax, mom."
"You work too hard already. Sit up and eat."
He sat up in bed and took a bite out of one of the tacos. That made him feel better. But his mother could tell that her son's mind was unsettled.
She sat on the edge of his bed and asked, "Is everything good at work?"
"Yeah. Dr. Bayron's great, you know, but... mom, let me get your opinion on something."
"The genius wants my opinion now."
"If you had the power to create something that you couldn't control, would you do it?"
"I created you." She gestured towards the remaining wisps of smoke that lingered below his ceiling light. She knew it was marijuana. She wished he didn't do it. "I don't do too good controlling you Sako, do I? But you always make me proud."
"But you never really did try to control me, now did you?"
"Control what, Sako? You was always a good boy." She pinched his chin as if he were an infant.
"But you didn't know that. In fact, you really had no idea how many temptations there are out there, you didn't grow up here. How could you just let me do my own thing not knowing what I could get involved in?"
"You think I was a bad mother?"
"No," Sako replied with an apologetic tone, "you were the best. But you know you took a big chance bringing me up in this country where you didn't know the people or the culture. I could have gone wrong so easily."
"I never worried, Sako. You come from good people and good people is good people. Should I have locked you away? Chained you to a radiator?"
"A lot of my friends weren't allowed to leave their houses or go to the store alone, but you always let me go."
"And you always came back. Why you bring all this up now? Does this have something to do with your artificial intelligence programs? Did you make something important?"
Sharky was not surprised at his mother's guess. Until he joined Bayron's team he had been working on an artificial intelligence project for SmithCorp under a military contract. He had to obtain a security clearance for that work and wasn't allowed to tell his mother any details of it. When he joined Dr. Bayron's team, he was sworn to an even higher level of secrecy. His mother did not know that he was no longer working on artificial intelligence but on artificial life itself, and he couldn't tell her. He was glad she had used him as an example. It allowed him to get her opinion without having to tell her what he had really been a part of.
"No, ma," he said, "Real artificial intelligence is still a long way off. But from a philosophical perspective, if we were to create an artificial intelligence, it would be at least a little like having a child."
Sharky's mother chuckled, "Oh, I suspect the process would be very different."
"But seriously, ma," Sharky continued, "let's say I did create a kind of artificial intelligence. I think it would be very much like having a child. You know, you send it off into the world, you don't know what its going to learn, you don't know what its going to do. Yet people do that all the time. So I mean, look: here we are, it's a country none of us have ever been in before and you say, ‘go, do, learn. Do what you're going to do, learn what you're going to learn, be whatever your going to be.' That's incredibly dangerous really, if you think about it."