Ancestral Night (White Space #1)

This particular serving was the real stuff, too—some beans we keep, unroasted and green so they go stale less quickly, mostly for off-the-books barter with other humans when we need it, but also for special occasions. It’s so much better than the recon I usually wind up drinking that it might as well not even be the same plant.

Not dying was probably a special occasion, so I waited patiently for the cracking sounds and wisps of aroma as Singer roasted me a bubble’s worth of beans, flash-cooled them, scrubbed the smoke, and ground them up for me, then dispensed measured hot water and centrifuged the result to get the grounds out. It was delicious, and the caffeine buzzed pleasantly across my nerves, and I let it ride. Human beings have been bumping and tuning since we first learned how to chew bitter leaves for the alkaloid high; we’re just better at the nuances now.

I was just about to pick up Jane Eyre again, having nothing particular to do for the next couple of decians, when Singer cleared his throat and said, “Thank you for this map, Haimey.”

I let the screen float near me, but hung on to my coffee. If I set it aside, it would probably float there forgotten until it cooled, and this stuff was too good to waste. I savored a sip and said, “You’re welcome.”

“Maps like this would have some value, too,” he said diffidently.

“In more than one way.” I waved at the blurs of light outside, which were now contorting and lensing in rippled tortoiseshell patterns as Singer coasted us around the rim of some giant gravity well. We were accelerating again, too, though I couldn’t feel it through the ship. The parasite was keeping me informed, though, as I was learning to read the information this new sense was feeding me.

We had just become the only ship in the Synarche—as far as I knew—that could navigate and course-correct while in white space.

I didn’t have time to really let the implications sink in, because I was busy running for my life and the lives of my best friends. But I knew it changed everything; it would speed up transit times, make it easier to correct after critical failures like the spin out of control that had gotten us here in the first place, possibly even put Singer and Connla and me out of a job by improving safety in white space and making it that much less likely that ships would get trapped inside white bubbles and not be able to find their way home.

It would have military implications as well. What if a ship could fight without leaving white space? Attack another vessel en passant? Bombard a fragile, infinitely vulnerable planet?

Worlds . . . were so terribly easy to destroy.

That would make us all the more desirable to the pirates as a prize, if they found out about it.

Well, that was a problem for another dia.

“I don’t follow,” Singer said.

“Don’t you think being able to get there faster on less fuel would be of benefit to us when competing with other tugs?”

“Hmm.” I figured he was running calculations on where the greatest social and personal benefits were.

I was wrong.

“I was just thinking,” Singer said slowly, “of what an operation like ours would have been able to accomplish, even a centad or two ago. So many ships used to get lost.”

“We’re still pulling some of them back,” I reminded him.

“Can you imagine coming out here in all this dark in a sublight ship?” he asked. “Most of the generation ships have never been even located, let alone recovered.”

“Generation ships,” I echoed, feeling a chill.

“At the Eschaton,” Singer said, “various Earth organizations—groups, sects, and even nation-states—sent out generation ships in a desperate bid to save some scrap of humanity, because the best-case scenario did not seem as if it would leave the homeworld habitable for long. One hundred seventy-three ships are known to have made it at least as far as the edge of the solar system.”

“Like stations, with no primary. Just . . . sort of drifting along, trying to be totally self-sufficient.”

“Yes,” he said.

It was a terrifying risk, a desperation gamble, and we both paused to appreciate it.

Then I said, “But Earth didn’t die.”

“Earth didn’t die,” Singer said. “But those generation ships did. As far as we know, anyway—their planned paths have been searched, once it became trivial to do so, but very few have been recovered.”

Connla looked up from his game board. One hand was resting carelessly inside the projection, and it made him look like his arm was half-amputated.

“Waste it,” Connla breathed. “They lost all of them?”

“One made it,” Singer said. “Sort of. But the people and the shipmind within it had changed too much to be integrated back into society. They took another way out.”

Connla said, “Suicide?”

“They transubstantiated,” Singer said. “Went into machine mind, totally, and took off in swarms of some Koregoi nanotech to inhabit the cosmos.”

“So, suicide,” I said. “With some plausible deniability built in.”

“Apparently,” Singer said, “the tech they were using allowed continuity of experience across platforms.”

“Continuity of experience,” I said. “But the thoughts themselves necessarily change, from meat-mind to machine.”

“Well, they were derived from one of the religious cults anyway, and very into the evolutionary perfection of humanity toward some angelic ideal.”

“Right,” I said. I’d heard of this. The Jacob’s Ladder. A famous ship from history. Like the Flying Dutchman or the Enola Gay. There was always some attraction, of course, to leaving your meat-mind behind and creating a version of yourself that lived entirely in the machine. But that creation wasn’t you; it was a legacy. A recording. A simulation.

Not because of any bullshit about the soul, but because the mind was the meat, and the meat was the mind. You might get something sort of like yourself, a similar AI person. It might even think it was you. But it wouldn’t be you.

Still, I guessed, it was better than nothing.

I wondered whether those swarms were still around, and what they were doing out there if they were. “You think the parasite is a nanoswarm?”

Singer snorted with mechanical laughter, which I took to mean agreement, or at least not seeing any reason to disagree.

He said, “I think there’s insufficient evidence to speculate.”

“That’s Singer for, ‘That’s as good an explanation as any.’”

He said, “Funny how, after all those ans of trying and failing to create artificial intelligence, the trick that worked was building artificial personalities. It turns out that emotion, perception, and reason aren’t different things—or if they are, we haven’t figured out how to model that yet. Instead, they’re an interconnected web of thought and process. You can’t build an emotionless, rational, decision-making machine, because emotionality and rationality aren’t actually separate—and all those people who spent literally millennians arguing that they were, were relying on their emotions to tell them that emotions weren’t doing them any good.”

He paused for slightly too great a duration, in that way AIs will when they’re unsure of how long it might take a meatform to process what they’re saying.

I sighed. “Come on, Singer,” I urged. “Bring it home. I know you’ve got it in you.”

He issued a flatulent noise without missing a beat. “You were in a hurry to get somewhere?”

“No, just wondering when we were going to find out where we were going.”

“Tough crowd,” he answered. “But I guess in that case you aren’t in need of softening up. Okay, what I was wondering is this: Is your Koregoi not-a-parasite a sentient? And if so, what is it feeling right now? And what does it want?”

I thought about that. With my emotions and with my logic. For . . . a few minutes, I guess; my face must have been blank with shock as I worked through the implications.

“I wish you hadn’t said that,” I said.

“It might not be an accurate assessment of the situation.”

That was Singer for comforting. For the first time in a decan what I really wanted was a hug. I took what I could get, instead.

“Well, it is what it is. If it tries to send me smoke signals, I’ll worry about it then. Whatever is going to happen is already happening.” I put my head in my forehands. “Right this second, I’m sort of wishing I could order everyone to shut up.”

“It’s okay,” Connla said kindly. “We’re glad you can’t.”

“But we can program people to be responsible adults!” Singer said.

“And you don’t see a problem with that?”

“Programming an intelligence? It would be hypocritical of me, don’t you think?” Singer had a way of speaking when he was making a point that always made me think of slow, wide-eyed, gently sarcastic blinking.

“Not everybody agrees with their own programming,” I said. “Not everybody likes it. Some of us have gone to lengths to change it.”

“Some of you were raised in emotionally abusive cults,” Singer replied brightly.

“. . . Fair.” I massaged my temples and didn’t say, Some of you were programmed to have a specific personality core by developers of a different species, too.