The Turing Test Page 18
Taylor motioned Frank to stand next to him and whispered, “Before we leave, I have to turn off the monitors. After that, we’re going to want to move as quick as we can. Someone at the nurses’ station will notice they’ve gone dead, if they’re paying attention.”
“Okay, Mr. Steiner,” Taylor murmured. “We’re going to help you get into this wheelchair now.” He took Jerry’s IV bottle off the pole by his bed and attached it to the one on the wheelchair. Then Taylor put one arm behind Jerry’s back and another under his legs and swiveled him around until he was sitting on the side of the bed with his legs hanging.
“Okay,” he said softly to Frank, “we’re going to link hands behind his back and under his legs and then lift him into the chair. On the count of three we’ll pick him up – slowly – not with a jerk. Ready?” Frank nodded. “Okay. I’ll grip your forearms and you grip mine. Good. One, two, and three.”
Frank straightened up from an awkward angle and felt Taylor’s hands tighten on his arms. A sharp pain shot across the small of his back as Jerry rose into the air, his head falling forward on his chest.
“Good job,” Taylor said. “Now I’ll rotate around you, and we’ll settle him into the chair. Okay. Now down.”
Taylor was moving quickly now. He turned off the life-signs monitor, pulled the oxygen monitor off Jerry’s index finger, and reached inside his hospital gown to pull off the cardiac sensors. Last of all, he pulled a blanket off the bed and tucked Jerry in to keep him warm. “Okay. Let’s go,” he said, putting one hand on Jerry’s shoulder to keep him from flopping forward.
Frank and Shannon followed Taylor into the elevator and down to the emergency ward on the first floor where an ambulance was supposed to be waiting for them. But there was no ambulance.
“Now what?” Frank whispered.
“Don’t worry. I’m sure it will be here in a minute. I’ll check in and see where they are.” He stepped through the automatic doors into the loading area outside.
“I don’t like this,” Shannon whispered. “What if they’ve already noticed Jerry’s missing? And what if Jerry has a heart attack in half an hour and dies? What do we do then?”
“I don’t know; run the tape backwards and smuggle him back into his bed?”
“Not funny.”
To their relief, an ambulance pulled up outside and Taylor reappeared. He gave them a wave, and Frank wheeled Jerry out. The driver helped Taylor this time, and in no time Jerry was strapped onto the collapsible gurney in the back of the ambulance. Frank and Shannon climbed inside, and they were on their way.
They said nothing while Taylor checked Jerry’s vital signs again. “All good,” he said, looking up. For the first time, Frank noticed a faint shadow of fuzz on his upper lip, masquerading as a moustache.
“When you say ‘all good,’ how good is good?”
“From his chart, it looks like there’s nothing to worry about. He had a terribly rough ride for ten minutes yesterday, though. In addition to a programmable pacemaker, he’s got a defibrillator. That’s a device that gives the heart a single jolt if it goes into fibrillation, which means a state of uncoordinated contractions. The idea is to shock the heart back into a normal rhythm. It’s highly effective, but when it fires, it feels like getting kicked in the chest by a horse. Based on the notes the EMT team wrote up, it must have kept firing every few seconds instead of firing just once. He couldn’t have survived that much longer, but there may not have been much damage to the heart itself.”
“May?” Shannon said.
“You had us grab him before they could do any tests to find out.”
“Oh.”
“Anyway, he’s still heavily sedated. He should come out of it gradually over the next five to ten hours. But I’ll keep him under light sedation for another day at least so he can rest.”
“What do you think is happening back at the hospital right now?” Shannon asked.
For the first time, Taylor smiled. “I expect there’s a little confusion up there by now. They’ll look for him in the bathroom, and when they see he’s not there, they’ll check his chart to see if he was taken downstairs for tests. When they draw a blank on that, they’ll look around the floor to see whether he decided to take his IV stand for a walk. And when that doesn’t pan out, either, they’ll call security.”
Shannon looked worried. “Then what?”
“By then whoever sent me to help you out will have gotten in touch with someone at the hospital to make the problem go away. That’s the way things are supposed to happen, anyway. Usually they do.”
“Great,” Shannon said, looking no less concerned.
Taylor leaned back on the bench seat and pulled out his phone, and they rode in silence until the ambulance slowed and gave a lurch. Shannon grabbed the edge of the bench seat as they turned onto what felt like a dirt road. A few minutes later, they rolled to a stop, and Taylor opened the doors. And there was Frank’s camper, right where he’d parked it before dawn that morning at the edge of a pasture bounded by woods.
Frank opened the rear door of the camper. “Do you need a hand?” he asked the paramedic.
“Yeah,” he replied, pulling the gurney halfway out and eyeing the width of the door. “I want to keep him strapped in the gurney until he’s fully conscious. That way he can’t roll off it. You take that side.”
“Great,” Taylor said, once the gurney was inside. “Give me a few minutes to figure out how I’m going to tie this down.”
“Take your time. We’ll settle in up front.”
Shannon checked out the camper when they climbed inside. “So how much is this like the rig you blew up in the book?”
“To be fair, I didn’t blow it up. The CIA did. I guess you could say I did put it in harm’s way, though. Anyway, it’s the same basic model, but I had it customized a bit. I hope George had someone put new license plates on this morning. Hang on while I check.”
“We’re good to go any time you are,” Taylor said when Frank was back in the driver’s seat.
“Great,” Frank said. “And we’ve got our new official identity as well. Never occurred to me I’d be a Hoosier one day.”
Frank turned the camper around and headed back up the lane. “So,” he said, turning to Shannon, “are you ready for a little adventure?”
“Didn’t we just have one?”
“I guess. But my bet is that was just the beginning.”
21
Road Trip!
Shannon had never seen so many farms in her life. With the hard-scrabble spreads of Appalachia behind them, they were sailing along arrow-straight roads through the corn and soybean fields of the Midwest. The biggest concern they had was staying invisible to Turing, and that meant staying as far away from sensors and cameras as possible, new license plates or not.
“You know,” Shannon said, “I always wanted to drive cross-country. But I never thought it would be under circumstances like these.”
“It’s fun, although it’s more fun the first time than the fourth.” He called back over his shoulder. “You okay, Jerry?” There was no reply. “Can you check on him?”
Shannon turned around and craned her neck. “Yeah, he’s okay. He just has his headphones on as usual.”
They had dropped Taylor off at a bus depot and tossed the gurney into a trash compactor at a transfer station an hour before. Shannon was flipping through a care plan for Jerry that Taylor had left behind.
“Boy, does this guy take a lot of medications.”
“Like what?” Frank asked.
“Well, let’s see. Statins – that’s a cholesterol drug. Baby aspirin. That’s for heart health, too, I think. Vitamin D supplement – well, that one’s obvious. I don’t know what the rest of this stuff is or what it’s for. He’s got an insulin pump for diabetes, too. Zack confirmed the settings are fine and left a thirty-day supply in the refrigerator.”
“Thank goodness, he has an insulin pump,” Frank said. “Jerry doesn’t seem like the kind of guy who would remember to eat, much less monitor his glucose level. Speaking of which, are you hungry?”
“Famished.”
“Okay. Let’s find a place to park.”
A roadside pull-out next to a stream came along a few miles later. Jerry pulled his headphones off when Frank tapped him on the shoulder. “Lunch?” Frank asked.
“Okay – and don’t forget – you promised to tell me what we’re doing in this vehicle.”
“Absolutely. Ham and cheese or tuna fish salad?”
“Do you mean sandwiches?”
“Sandwiches – right. Which would you like?”
“Why, I don’t know. I don’t usually eat lunch.”
“Ham and cheese it is then. Why don’t you take a seat at that picnic table over there? Shannon and I will join you in a few minutes with the food.”
Jerry stood up and paused at the door of the camper, squinting in the bright sunshine streaming inside. “Out there?” he said.
“Yes. Don’t worry, we’ll be there soon.” Frank eyed Jerry as he walked uncertainly across the grass. “Get a load of this,” he said, nudging Shannon. They watched as Jerry stared at the picnic table for a moment before awkwardly swinging one leg over the bench seat. He grabbed desperately for the edge of the table when he lost his balance. “He looks like a house cat that’s found itself outside for the first time in its life.”
When they joined him, Jerry was hunched over and fidgeting. “What in the world is that?” he said, pointing toward a pasture across the stream.
Shannon and Frank looked at each other. “What’s what?” Frank said.
“That enormous animal.”
“You mean the cow?”
“Oh!” He looked at it with genuine curiosity now. “My goodness. How extraordinary!” Then he turned back abruptly. “Now please tell me – why am I here?” he asked.
“Right,” Frank said, setting a sandwich in front of him. “What’s the last thing you remember before you woke up in back?”
“Well, let me see. I know I was sitting at my desk, but then I usually am.”
“Do you remember what you were working on?”
“Hmm. Let me see. My big task this week was finishing up the Turing Nine alpha release and getting ready to move it over to my developer team.” He gave the sandwich a curious glance and picked it up. “So, I guess I was working on a memo assigning development tasks to my staff. And then, all of a sudden, I’m waking up in this – what did you call it – this ‘camper.’”
He took a bite and began chewing it reflectively. It struck Frank that Jerry bore an astonishing resemblance to Thor consuming a lettuce leaf.
“What an extraordinary experience,” Jerry concluded. “Now, would you please explain to me what happened?”
“Well, I’m sure it wasn’t personal, but we believe Turing Nine tried to kill you.”
“What?”
“We think Turing Nine reprogrammed your defibrillator to fire every few seconds until it killed you. Luckily, the battery ran down before it did. You were in a hospital overnight. That was two nights ago.” Frank stopped, wondering how Jerry would take the news.
After a long pause, Jerry grinned and said, “What?”
It took a while, but eventually Jerry had absorbed most of the details of what had happened. But he still couldn’t imagine why he had been attacked.
“But why would Turing want to kill me? I created it! It wouldn’t exist without me! What was it I did to make it mad?”
It was Frank’s turn to pose the same question. “What?”
“What about what?” Jerry said.
“What do you mean, make it mad? It’s a computer program.”
“Just because it’s a computer program doesn’t mean it doesn’t get mad.”
Frank massaged his forehead. Already they were heading down a Jerry-land semantic rat hole. He struggled for a more useful way to move the exchange forward than simply saying “Oh no, it can’t.”
“Can you help me out with that, Jerry? Computer programs are just machines that execute logical instructions. They don’t have emotions.”
“Well, they do if you program them to have them.”
“Why on earth would you want to do that?”
“It’s the other major innovation I added with this release, along with Recursive Guess Ahead. The purpose is to more closely approximate human intelligence.”
“What do emotions have to do with intelligence? I thought they just got in the way?” Frank said.
“Oh, well!” Jerry said. “I quite agree with you. I’ve certainly never found any use for emotions; I’m not sure I even have any. But they must provide a survival advantage, or evolution would have eliminated them eons ago. And ensuring our survival is at the core of Turing’s mission. Not to mention that a program that gets destroyed can’t complete its mission.”
Shannon frowned. “How would emotions protect us?”
“Well! For one thing, they’re a great way to reorder priorities. Let’s say you see a tiger in a zoo inside a cage. That’s not very important information, from a survival point of view, so you might just stroll by. But let’s say you see a tiger outside a cage. That information could be very important! And it would be important to act upon it very quickly. Instinctive fear reorders priorities instantly. It also invokes immediate responses that evolution has shown to be helpful in dangerous situations, like a quickened heartbeat and an increase in adrenaline production. Traditional computers are getting better at identifying situations and selecting actions to take in response, but selective urgency isn’t a concept we usually consider in the context of programming.”
“Well, that’s very intriguing,” Frank said. “I don’t recall anyone ever suggesting programming emotions into computers before.”
“Oh, my goodness, that’s not so. People have been talking about how to emulate emotions in computers for decades. There’s quite a body of literature on the topic. Marvin Minsky, for example, expressed his belief thirty years ago that computers wouldn’t be able to become intelligent without emotions.”
“That’s very interesting,” Frank said. “How exactly do you go about adding emotions to a computer?”
“The high-level concept is fairly simple. First, a computer needs to be able to recognize events, conditions, or situations that would trigger an emotional response in a human being – like the sight of a tiger outside its cage. Then it compares that data to a list to see what emotion, or emotions, that information would trigger and logs that information in. If the information from that, and any other inputs, passes a predetermined threshold, the response is triggered. Of course, that means you also need a list of behaviors the computer is supposed to demonstrate when the emotion is triggered. So, for our tiger example, the list might include ignoring other stimuli and focusing exclusively on data relating to escaping the tiger situation.
“Putting that into practice, especially in a realistic way, however, is very difficult. Most of the work to date has been theoretical. Not many emotional AI programs have been created yet, and those that have had very narrow applications, like making a storytelling program read more realistically or instructing a human-like robot when and how to exhibit facial expressions appropriate to the moment.”
“That’s fascinating. How did you decide to go about it?” Frank said.
“To begin with, I focused on the fact that human emotional responses originate in very primitive parts of the brain. So, if I wanted to realistically emulate the impact of emotions on a computer, I’d need to figure out how to start the process at the lowest level of the program’s logic.
“Then I tried to learn as much as possible about what we know about emotions. The first thing I found is that there isn’t consensus on what emotions are. Experts don’t even agree on how many distinct emotions exist, or what names to give them. Dr. Ekman identified six basic emotions – anger, disgust, fear, happiness, sadness, and surprise. But Dr. Plutchik thinks there are eight, paired as opposites: joy versus sadness; anger versus fear; trust versus disgust; and surprise versus anticipation. Everyone agrees there are lots of lesser emotions, like envy, pride, lust, and so on. As well as moods and temperaments. It’s really quite complicated and undefined,” Jerry sniffed.
“My first decision,” he continued, “was to simplify things as much as possible. I started by sorting all commonly recognized emotions into three categories: the first group included emotions I decided were relevant and potentially helpful for computer intelligence. In the second group, I put the emotions I considered relevant to AI in the sense that they could be harmful to effective computer intelligence. I assigned all the remaining emotions to the third group, which, of course, was meant to include emotions I concluded would rarely have a positive or negative impact on machine intelligence.
“The relevant-slash-positive category proved to be the shortest, which was helpful. It included only confidence, courage, curiosity, distrust, empathy, and fear.
“I assigned annoyance, anger, anxiety, apathy, boredom, contempt, despair, envy, and many more into the relevant-slash-negative group.
“That left the irrelevant ones, such as affection, anguish, anticipation, awe, contentment, disgust, et cetera. But, of course, if you really want to emulate human thinking, then you can’t ignore any emotions, because how can we really know what is relevant? So, by creating the third category, I was mostly just deciding which emotions I would return to later.”