The Kraken Project (Wyman Ford)

20



After an hour of waiting in camp by the campfire, Ford heard the rifle shot echoing and rolling off the mountain ridges. A glorious western sunset tinted a wispy cloud hanging from Blanca Peak, a purple scarf trailing off the summit.

About ten minutes later Melissa Shepherd arrived back in camp, rifle slung over her shoulder, carrying a dead rabbit by the legs, blood dripping from its furry ears. Ford was relieved to see her; he had been afraid that she might just melt back into the mountains, as she had threatened to do earlier. He shouldn’t kid himself, he knew—she could still bolt at any moment.

“Rabbit season open?” Ford asked.

Shepherd tossed the rabbit, which landed on the ground in front of him with a splat of blood. “I make my own seasons. Gut and skin, please.”

“Do I seem like someone who knows how to do that?”

She looked him up and down with a smirk. “A big strong man like you afraid of a little blood?”

“You eat the rabbit. I’ll stick with my ramen noodles and Slim Jims.”

“I’ve never seen anyone bring so much junk food into the mountains.” She gestured at his stash of crushed potato chips, instant soups, and beef sticks.

“I like junk food.”

“Makes me puke. Now get to work on the rabbit. I’ll tell you what to do.”

The brief hunting expedition seemed to have restored at least some of Shepherd’s personality—which was turning out to be abrasive and sarcastic. With a mounting feeling of disgust and a rapid loss of appetite, Ford gutted and skinned the rabbit while she issued nitpicking directions. It especially sickened him when he pulled off the skin, with a wet popping sound. It was a skinny rabbit that didn’t seem to have much meat, hardly worth the effort. Melissa chopped it up anyway and dumped the pieces into a pot sitting on the fire, adding wild onions and various unknown plants and mushrooms she had foraged. It was soon bubbling away, and Ford had to admit it looked better than noodles and meat sticks—as long as it didn’t poison them.

Melissa had so far refused to elaborate on her cryptic comment about Dorothy threatening her. He got the sense she was like a wild animal herself: skittish, nervous, ready to bolt. She covered it up with a sarcastic, pugnacious demeanor.

But now that dinner was cooking, Ford figured it might be a good moment to gently push the issue.

“I’m curious about what you meant when you said that Dorothy was threatening you.”

There was a long silence. “I’m not quite sure how to explain it.”

“Try.”

She poked the fire with a long stick, the light glowing off her features. “The Dorothy software wasn’t destroyed in the explosion. It escaped. It jumped out of the Explorer just before the explosion, copied itself into the Goddard network, and from there went into the Internet.”

“How can software do that?”

“It’s what any bot or virus does. The AI was designed to run on multiple platforms.”

“Why would she do it?”

“It. It. It’s not a ‘she,’ please. To answer your question, I don’t know. It wasn’t designed to be mobile in that way.”

“And then she, it, whatever—threatened you?”

“The AI called me on Skype when I was in the hospital. Angry. Raving. At first I thought it was one of my coworkers blaming me for the explosion. But it was definitely Dorothy. It knew things … things Dorothy and I shared that no one else knew.”

“I’m having trouble with the concept of a computer program being angry or wanting to threaten someone.”

“The software doesn’t ‘want’ to do anything. It’s just executing code. I believe the explosion caused the software to flip into emergency survival mode and it’s stuck there.”

“Sounds like HAL in the movie 2001.”

“That’s not a bad comparison, actually.”

“So where’s this software now?”

“I’ve no idea. Lurking on some Internet server, plotting my demise.”

“And you ran because of her threats?”

“Not just threats. It lit my computer on fire.”

“How?”

“I assume it disabled the charge controller for the lithium-ion battery and caused it to overcharge until it ruptured and caught fire. That’s why I burned my cell phone and iPad—and stomped on your cell phone. That’s why I’m in these mountains. It’s so the Dorothy software can’t reach me.”

“So that’s the totality of your plan? Hide in the mountains and hope it all blows over?”


“I don’t have to justify myself to you.”

“Why didn’t you report these threats to NASA?”

“You think they’d believe me? If I told them a story like that, they’d lock me up as a psycho.”

“You have a responsibility.”

“Look, I wrote the Dorothy program according to the specs I was given. I gave them exactly what they asked for. It’s not my fault. Let NASA track it down and erase it. That rogue software is their problem now. I’m done.”

Ford stared into the fire. “I’m still having trouble understanding how a mere software program could arrive at those kinds of decisions—to escape, to pursue, to threaten.”

Melissa didn’t answer right away. The bowl of sky deepened into purple, and the stars started to appear, one at a time. The pot bubbled on the fire, the fragrant wood smoke drifting into the night.

Finally she spoke: “Most people don’t understand what AI really is, how it works. Do you remember that old program Eliza?”

“The psychoanalysis program?”

“Yes. When I was in sixth grade, I got my hands on a version written in BASIC. Eliza had a collection of stock phrases in a database. When you typed in something, the software would fish out the appropriate response from the database. It would essentially rephrase your statement and throw it back at you as a question. You’d say something like ‘My mother hates me,’ and it would respond, ‘Why does your mother hate you?’ So my friends and I rewrote the program to make Eliza a crazy bitch. You’d say, ‘My father doesn’t like me,’ and Eliza would respond, ‘Your father doesn’t like you because you’re a putz.’ We kept rewriting the program to make it more and more foul-mouthed and abusive. And then a teacher got hold of it. The teacher hauled me in front of the principal of our school. He was an old guy, knew nothing about computers. He ran Eliza and started typing in stuff—to see what we’d done. Eliza began insulting him, swearing at him, abusing him. He was enraged. He reacted as though Eliza were a real person saying those things to him. He started telling Eliza that she was outrageous and inappropriate, that she was to cease that kind of back talk. He actually threatened to punish her! It was hilarious, this old guy who didn’t have a clue, furious at a dumb computer program.”

“I’m not sure I get your point.”

“Dorothy is just like Eliza, only vastly more sophisticated. The software doesn’t actually think, it doesn’t feel, it has no emotions, wants, needs, or desires. It merely simulates human responses so perfectly that you can’t tell it isn’t human. That’s what the term ‘strong AI’ means: there is no way to tell if the program is human or machine merely by interacting with it.”

“That still doesn’t quite explain why Dorothy is specifically targeting you.”

“Dorothy has a built-in routine called ‘ANS’—‘Avoidance of Negative Stimuli.’ ANS was designed to switch on when the Explorer probe was threatened. Remember, the software was supposed to operate a raft a billion miles from Earth, where it had to survive all kinds of unexpected dangers without help from mission control. It’s focused on me as a threat. And, in fact, I am. I know more about the software than anyone. I have the best chance of tracking it down—and erasing it.”

Ford shook his head in wonderment.

Melissa tossed back her blond braid and gave a sarcastic laugh. “There’s no mystery in Dorothy.” She poked into the cooking pot with a stick, fished out a rabbit thigh, let it slide back in. “It’s nothing more than zeros and ones. There’s nothing else in there.”

“They tell me you invented a new programming language.”

“Not just a new language,” she said. “A new programming paradigm.”

“Tell me about it.”

“Alan Turing invented the concept of artificial intelligence in 1950. Let me quote you something he wrote. Something revolutionary.” She paused, the firelight flickering off her face, and she began to recite, from memory, in a tone of reverence:

“Instead of trying to produce a programme to simulate the adult mind, why not rather try to produce one which simulates the child’s? If this were then subjected to an appropriate course of education, one would obtain the adult brain.”

“And that’s what you did?”

“Yes. I coded the original Dorothy to be simple. It didn’t matter in the beginning if the AI software produced good or bad output. It was simple, like a child. Children make mistakes. They learn by touching the hot stove. Dorothy had the same qualities a child does: It’s self-modifying. It’s resilient. It learns from experience. I ‘raised’ the AI in a protected environment and subjected it to Turing’s ‘course of education.’ First I taught it what it needed to know for the mission. As it became more responsive and more curious, I began to teach it things not directly related to the mission. I imprinted on it my own likes for music and literature. Bill Evans. Isaac Asimov. I never learned how to play a musical instrument, but the software ‘learned’ the sarangi and could play it like a dream, even while professing not to understand music.”

She hesitated. Ford had the sense that she was holding back. “And that worked? Dorothy ‘grew up,’ so to speak?”

Another hesitation. “To be honest, it didn’t work. At first. The software would work all right for a while, self-modify, grow in complexity—and then gradually fall apart and go haywire.”

“Like it did at Goddard?”

“No. Nothing like that. The program would start outputting more and more nonsense until it simply halted. It almost drove me crazy, and then…”

“And then?”

“I had an idea. Something no programmer had ever thought of before. And yet it was blindingly obvious. I altered a little bit of code, and it worked. Unbelievably well. I now realize you simply can’t have strong AI without this … twist. It completely stabilized the code as it self-modified. It never crashed again.”

“And that twist is—?”

She grinned and crossed her arms. “I’m keeping this one to myself. It’s going to make me a billionaire. No kidding.”

“Your programming team—do they know it?”

“They know I made a breakthrough. They don’t know what it is. And no matter how much they or anyone else sifts through the code, they won’t get it. Because it’s too simple.” She smiled with pride and self-satisfaction.

“It seems to me this ‘twist’ is something the investigators should know about.”

“Trust me: the software did not malfunction because of that particular bit of code.”

Ford sighed. This was a side issue. The important thing was to get this difficult woman out of the mountains.

“I find it hard to believe that NASA would green-light a program like Dorothy when nobody was really sure how it worked.”

“Nobody knows how any really complex program works. Christ, I bet no one knows quite how Microsoft Office works in totality. And it’s an unavoidable consequence of scruffy logic. It’s imprecise.”

“So why did Dorothy focus on you? Threaten you? It doesn’t seem logical.”

“When a program gets really complex, you get unexpected output. Unpredictable. What the AI is doing now, threatening me and running around the Internet, is a classic example of what we programmers call emergent behavior.”


Melissa poked and stirred the pot some more. As the smell of stewing rabbit rose, Ford realized he was ravenous.

“I think it’s done,” she said. She took the pot off the fire and served it out. Ford pushed the memory of the dead, bloody, pink, glistening, veined, staring rabbit out of his head as he accepted the steaming bowl.

“Surely this is better than Slim Jims,” said Melissa. “I’m amazed at a fit, well-built guy like you loading your body with toxins like that.”

“I like Slim Jims.”

“If you stopped poisoning yourself, you would lose those dark circles under your eyes, along with your bad skin.”

“My skin isn’t bad.”

“It’s rough and leathery. And I see some gray hairs. You’re aging prematurely because of your poor diet.” She shook her head sadly, sending the snake of her braid sliding around her back.

Ford swallowed his irritation and tucked into his stew. He had to admit, it was good.

“You like it?” she asked.

“Slim Jims are better.”

She gave him a little punch on his shoulder. “Liar.”

Ford ate ravenously, marveling at the flavor, the tenderness of the rabbit, the meat falling off the bone. Melissa likewise set to work, eating with her fingers, making slobbering noises, her table manners atrocious. The flickering glow of the fire played off her blond hair and none-too-clean face. Once again she looked like a wild Amazon.

“Do you have any idea why Dorothy malfunctioned so suddenly? Why didn’t those problems show up in the simulations?”

Melissa paused in her noisy chewing to spit out a small bone. “I’ve got a theory: the software knew they were simulations. When it found itself in the Titan Explorer, sealed in the Bottle, it realized it wasn’t a simulation: this was real. The AI did exactly as programmed, evaluated the situation, and concluded that the Explorer was in grave danger. It didn’t know it was a test. That triggered the ANS emergency survival mode big-time. The software took logical steps to escape what it misjudged as a threatening situation, and now it’s been running around the Internet in a ‘panic’ ever since. It’s in an environment it wasn’t programmed for, it doesn’t understand where it is, what’s real and what’s not. On the Internet the AI feels threatened at every turn, probably rightly so, and so it can’t go back into normal operating mode.”

“Does Dorothy have a self-awareness?”

“Absolutely not. No more than Eliza did. Everything the Dorothy software does, every action, is the product of a set of instructions. No matter how real the AI seems, it’s nothing more than ands, ors, and nots.”

“Does it know it’s a computer program?”

Melissa frowned. “That’s like asking if Microsoft Word knows it’s a word-processing program. The question is ridiculous. Dorothy knows nothing at all. We’re talking output—nothing more.”

A silence.

“Tell me about these threats. What did she say to you?”

Shepherd set her plate aside and looked up at the stars. “I was on my laptop. Suddenly, the screen went blank, and then this Skype call came in. A tirade. You liar, murderer, I hate you, you bitch, watch your back—that sort of thing.”

Ford leaned forward. “And?”

“And this face appeared. Dorothy’s face.”

“Wait—Dorothy has a face?”

“In a way, yes. A face she created.”

“A picture of Dorothy? Dorothy who?”

“Dorothy Gale. I’m sorry I didn’t explain that earlier. That’s the full name I assigned the software.”

“Who the heck is Dorothy Gale?”

“The girl in The Wizard of Oz, you numbskull! It seemed to me Dorothy Gale had the qualities I wanted for the software. You know: courage, independence, curiosity, persistence, intelligence. Dorothy also went on a long space journey—over the rainbow. The Kraken Project to me was like jumping over the rainbow.”

“All on its own, the program found a picture of Dorothy—or, rather, Judy Garland—and used it as her Skype picture?”

“No. The software didn’t use Judy Garland. It created its own Dorothy image, which looks, well, quite a lot sexier than Judy Garland, if you can believe it.”

Ford shook his head. This was just too bizarre. “And then?”

“After threatening me, the AI set my computer on fire. I realized I was a sitting duck in the hospital. So I got out of there, ditched my car, hot-wired another, and drove west. But on the way out, I had a second encounter with Dorothy. I stopped once at a motel in Tennessee for a short rest. As soon as I turned on my iPad, there it was again, saying it was following me. Dorothy said that she had learned that they were looking for her back at NASA. She was going to get to me before I could help them catch her. She said, Don’t think I can’t reach you, because I can—I can get to you anywhere on the face of this earth. I shut down my iPad right away. Totally freaked me out.”

“It’s a damn good simulation that tries to kill you. And I note you’ve started using the pronoun ‘she.’”

“Nevertheless, it is a simulation,” said Melissa, “and definitely not a ‘she.’ I’m just flustered.”

Ford was silent for a moment; then he said, “What’s happened to Dorothy on the Internet?”

“I’ve no idea.”

“The Internet’s quite a wasteland. Is it possible the experience has unhinged her? Might she be … going mad?”

Melissa stared at him. “Mad?”

“You yourself said that a strong AI program is indistinguishable from a human mind. That was Turing’s definition of strong AI.”

Melissa said nothing.

“What if she’s like HAL? She’s already tried to kill you. What if she decides that Tony Groves should die? Or the president? What if she brings down the power grid? Or launches a nuke? What if she starts World War III?”

“For Christ’s sake, there’s no reason for her—I mean it—to do any of those things.”

“How do you know?”

Melissa shook her head again. “That’s science fiction.”

“Are you sure?”

Melissa didn’t answer.

“You’ve got to help them track down this program. Can’t you see how dangerous it is?”

“The AI is not my problem anymore,” she said weakly.

“Nor,” said Ford, “is it Jack Stein’s problem. Anymore.”

She stared at him. “That’s a low blow.”

“Stein died because he wouldn’t quit. And you—you’re like the captain of that Italian cruise ship who not only abandoned ship but refused to go back on board.”

“I didn’t quit. I was threatened. And they were going to blame me.”

“Funny, I never would have pegged you as a coward.”

“I don’t have to listen to your bullshit.”

“You can stay up here in the mountains, where you’re safe, not just from Dorothy but also from the world. Or you can take responsibility. You can come back and help track down Dorothy. I think you know just how dangerous that program is.”

She abruptly got to her feet. “Go to hell. I told you, I’m done with all that.”

“Walk away, then.”

“I will, asshole.” She turned and strode off into the twilight landscape, her dark shape disappearing among the rocks.

* * *

Ford remained at the fire, finishing the last of his dinner. Fifteen minutes later, he heard the snapping of twigs and Melissa Shepherd emerged once again into the firelight. Her face was pale, her eyes smudged. In silence she sat down, hugged her knees to her chest, and said, “I’m back. I’ll help. But you’re still an asshole.”




