Crucible (Sigma Force #14)

What if they’d already started?

She finally dissected out three dozen unique microkernels, thirty-six data points of Eve’s digital fingerprint. She copied them, uploaded them into Orange’s search engine, a system already designed to scan, debug, and monitor its network.

She sat back, watching the meter running along the top of the screen, picturing her code coursing through Orange’s server farms, both those buried under this tower and others spread throughout the globe.

As she waited, she stared out the windows that overlooked the dazzling tapestry of wintry Paris. Though it hadn’t snowed, an icy fog had rolled in from the Seine, misting the city lights into a hazy illusion of itself, as if Paris were a dream vanishing into the night. Yet, above it all, thrusting out of the mist, the Eiffel Tower glowed like the last beacon of the dying city.

Mara shivered at this thought, fearing such a fate might still come true.

A chime sounded from the computer, announcing the completion of her scan. She read the results: 0.00% MALICIOUS FILE MATCHES. She closed her eyes and sighed.

All clear.

Jason nudged her shoulder, reading the same. “So, the Crucible hasn’t attempted to upload Eve into Paris’s systems yet.”

“No,” she conceded, then qualified her statement. “That’s assuming this digital fingerprinting even has any efficacy. We may be wasting our time here.”

Jason leaned down and tentatively placed a hand on her shoulder. “Quit second-guessing yourself. Your methodology is sound. Brilliant, in fact.”

She glanced up to him, noting his dimples, the light scruff of blond beard over his chin and cheeks. “Thanks.”

He grinned back at her. “Of course, now comes the hard part.”

She frowned, returning her attention to the screen, wondering what he meant.

“The waiting,” Jason clarified. “Because this will work. If the Crucible makes any attempt to corrupt Paris’s infrastructure with your program, we’ll know it.”

Mara took a deep breath, drawing confidence from his firm assurances. “The scan will continuously run from here. If it detects malicious code that matches any of the thirty-six data points I uploaded, we’ll be notified immediately.”

Still, a larger anxiety ate at Mara, one fraught with guilt. As she stared at the screen, at the spinning wheel of the ongoing scan, she voiced it. “I should never have built Eve. What was I thinking?”

“If you hadn’t done it,” Jason assured her, “someone else would have. And maybe it’s best it was you.”

“Why me?”

Jason stepped to the desk, sat on its edge, and swiveled her chair to look more directly at her. “I studied your design. The architecture of the Xénese device is brilliant, from the cobbling together of Google’s quantum drive to your incorporation of chameleon circuits.”

“Chameleon circuits?” Carly asked.

Mara explained, happy for the distraction: “They’re logic circuits that can switch function on the fly, even repair themselves.”

“It also makes the system infinitely more versatile,” Jason said. “It’s fucking genius. If you’ll excuse my French.”

“Well, you are in France.” Mara allowed a smile to form, the first in what felt like months. “So I guess it’s okay.”

Jason matched her grin. “And that versatility of function allowed you to program uncertainty into your creation.”

Carly frowned. “I don’t understand. Why would you want Eve to be uncertain?”

Jason began to explain, but Carly cut him off with a raised palm, looking to Mara instead.

Mara took up the gauntlet. “Uncertainty is a key aspect of human reasoning. Without uncertainty, we would never doubt ourselves or our decisions. We would be certain that we’re right all the time. It’s this certainty that can make an AI’s ability to learn turn brittle over time. But if an AI is uncertain and capable of doubt, it can begin to judge itself, to question whether an action or decision will have the consequence it desires and test it more thoroughly. In this way, it begins to understand probability—specifically the convoluted relationship between cause and effect.”

Jason nodded. “This means—”

“I know what it means,” Carly snapped. “I don’t need you mansplaining it to me.”

Mara tried to intervene. “I don’t think Jason meant it that way.”

Her attempt at appeasement only sharpened the irritation in Carly’s eyes.

“Whatever,” she said.

Jason tried to change the subject. “I think we got off track. Mara, a moment ago, you questioned whether you should have risked creating Eve in the first place. It’s best you did.”

“Why?”

“Otherwise, you might have doomed yourself.”

“Doomed myself? How?”

“Have you heard of Roko’s Basilisk?”

Mara shook her head and glanced to Carly, who shrugged, clearly unwilling to admit the same. Still, curiosity drew her friend closer to her side.

Jason sighed and rubbed his chin. “Then perhaps I should leave this alone. I could cause you harm if I explained . . . and on top of that, I definitely don’t want to be caught mansplaining again.”

He looked pointedly at Carly with a ghost of a smile. Mara couldn’t help but smile back, captured by his teasing manner.

“Fine,” Carly huffed out. “What the hell is Roko’s Basilisk and why shouldn’t we know about it?”

“Okay, but remember, you were warned.”


10:18 P.M.

Carly kept her arms crossed, still irritated with this guy. She couldn’t explain why he so irked her, but he did. Sure, he was cute and his manner easygoing, but she and Mara had been attacked at the airport, ambushed at her hotel, and kidnapped at gunpoint, only now to be babysat by some covert U.S. paramilitary team, which included this self-assured tech expert.

Who wouldn’t be pissed after all of this?

Apparently, Mara.

Mara had quickly glommed on to this guy: whispering with him on the car ride over, talking shop, comparing technical notes. Like they were already the best of friends. Carly also noted Mara’s shy smile, the way she brushed aside strands of her dark hair to cast sidelong glances his way.

Both possessive and protective of her friend, Carly wished he’d leave them alone and join the others of his group. Her annoyance flared as Mara reached over and touched his knee while he leaned on her station’s desktop.

Carly stared at her hand, remembering the soft heat of her friend’s palm on the car ride over here. Mara stared up at the guy from her seat, an amused grin playing about the gentle bow of her lips.

Mara spoke, giving her consent, too. “Okay, I’ll take the chance. Tell me about Roko’s Basilisk.”

“It was a thought experiment that popped up on a website run by a tech expert in the Bay Area, Eliezer Yudkowsky.”

Mara dropped her hand, her eyes going wider. “Yudkowsky?”

“You know him?” Jason asked.

Mara turned to Carly. “Remember when I told you about the AI Box Experiment?”

She nodded. “When some guy pretended to be a supercomputer trying to convince its gatekeepers to let it out of its digital box?”

“Exactly.” Mara brightened. “The guy who played that supercomputer, who was able to talk his way out of the box each time, that was Yudkowsky.”

Carly frowned. “Okay, but what’s this thought experiment on his website?”

Jason explained, “It posits that a superintelligent AI will undoubtedly come into being and quickly grow into a godlike intelligence, capable of nearly anything. One of the primary drives of this new AI god will be to strive for perfection, to better itself, to improve its surroundings.”

Mara nodded. “That’s pretty much what most experts expect could happen if we’re not careful.”

“Right. This is the Basilisk, the monster of this story,” Jason said. “And since this godlike AI is wired to make things more perfect, it will judge anything or anyone that thwarts this central drive to be an enemy. This includes anyone that tries to stop it from coming into being in the first place.”

“Even us,” Carly said, intrigued despite herself.

“Especially us. It will know humans very well and it will know we are motivated by fear and manipulated by punishment. So to discourage humans in the future from trying to stop or interfere with its programming, it will look to the past, judge those who attempted to stop it, and torture them.”

“To make an example of them,” Mara said.

Carly frowned. “But what if those people are already dead in this future scenario?”

“Doesn’t matter. That won’t stop this Basilisk. Being an omnipotent god, it will resurrect past miscreants. It will create perfect simulated copies, avatars that will think they are you—and the Basilisk will torture them mercilessly for eternity.”

Mara looked sick. “A digital hell.”

“But remember, this perfection-seeking Basilisk will be quite meticulous during its judging process. It will not only punish those who actively tried to stop it. It will also decide that anyone who didn’t actively help it come into being in the first place is equally deserving of this same punishment.”
