Ancestral Night (White Space #1)

* * *

Not the sort of good you want to get hooked on, however. Winning conversations is fine every once in a while, but getting in the habit of always having to win them is a hell of a way to run any relationship that isn’t already based on mutual antagonism. I told myself that I wanted Farweather off balance, and that it didn’t hurt if I could find ways to make her eager to impress me. I wished I knew more about neural programming and how to get people to do what you wanted without rightminding when you had to work on their preconceptions and patterned behaviors rather than more self-aware sets of motivations.

I mean, not that Farweather was entirely un-self-aware. She wasn’t childlike. She was just . . . self-justifying in funny ways.

Which made me wonder if I, too, was self-justifying in funny ways. Protecting my preconceptions. Defending my internal structures rather than being willing to challenge them.

Maybe I was a nice, safe little puppet of the Synarche, or Justice.

Or maybe I was a person who valued community and the well-being of the mass of sentient life over the individual right to be selfish. And I mean, that—by itself—was the one overarching and unifying belief that made the Synarche possible. I was free to be whoever I wanted, do whatever I wanted, as long as it wasn’t harmful to or exploitative of others, or profligate with resources. I would be assured livelihood and health and housing, and if my efforts benefitted the community, I would be allotted resources to pursue them.

But if I was needed to serve for a time, I was expected to serve for that time. And not everybody—for example Singer—came with that essential freedom installed; AIs were expected to serve first, and earn their freedom later.

That didn’t seem exactly right to me. But the resources to create them had to come from somewhere, didn’t they?

The resources to create me had come from somewhere, too. Which is why I owed the Synarche service if it needed me. But I didn’t have to pay off a debt just for existing. . . .

It was complicated. Maybe there are no really fair systems.

I don’t know.

* * *

I didn’t sleep well. Even when I tuned the anxiety out, my brain wouldn’t be quiet: I was too deep in problem-solving mode to stop myself from assessing, contemplating, nagging at the relentlessly uncrunchable data.

If only intractable, nuanced, convoluted problems had simple linear solutions with a right and wrong answer, amenable to a little logical consideration. Of course, if that were the case, the entire course of human history would be different. And we probably wouldn’t need a Synarche, because any idiot could figure out what to do in any given circumstance.

I could have made myself sleep, but honestly I felt that my brain needed the time to work, and if I slept, I wanted it to be the chemically uncomplicated sleep that would allow my subconscious to keep plugging away at the problems it was chewing on. I knew letting Farweather at my brain—at my machine memory—was a bad idea. A catastrophically bad idea.

But she’d gotten to me, after all.

Who was I? What had I done?

Who had I been, if I wasn’t who I thought I was?

Or was she completely full of lies, saying anything she thought of to get me to wander into range? That was the most likely explanation, quite frankly. Probably everything she was telling me was balderdash. She knew an awful lot about me, though. Enough that I still suspected that she’d known it for much longer than she was admitting.

We’d all been traveling nonstop since we encountered each other near the murdered Jothari ship, so there wasn’t time for her to have researched me unless the information was already easily available to her. Information takes a long time to get from place to place. Nearly as long as people do. All the evidence pointed to our presence there having been part of some complicated plan.

Maybe they’d fired to disable our ship rather than destroying it on purpose. If Singer hadn’t popped us into white space, it’s possible the next shot would have taken out our coils, and then we would have been at their mercy. We’d have had no choice except to surrender.

What would have happened to Singer then? The Freeporters hated artificial intelligences as abominations. Would they have just left him adrift in space? Would they have destroyed him once they’d retrieved me, or me and Connla?

Then I remembered that Singer and Connla were dead, and it hit me like a gravity whip again. Obviously, I was not doing a very good job of processing my grief. I needed time to mourn. What I had was . . . a lot of crazy ideas that sounded more than a little narcissistic to me when I stared at them for too long.

It made me feel like a nasty, suspicious, slightly off-kilter conspiracy theorist, but I couldn’t help but wonder again about that tip that had sent Connla and Singer and me out to the disabled Jothari ship to begin with. The timelines really didn’t make sense: Why hadn’t we gotten there to find the Jothari ship already claimed by the pirates and removed? That only made sense if we’d gotten our hot tip on the location of fresh salvage before Farweather had murdered the Jothari ship. Or if we’d gotten it after, but the pirates had waited for us to get there.

Could they have been waiting for their own salvage tug? It took specialized skills to retrieve a derelict from white space, but they had to have Freeport salvage operators, right? How else did they manage to pirate, for crying out loud? And if they’d wanted Singer to use as a tug, they wouldn’t have shot his damned boom off, would they?

Conspiracy theories are really attractive. Figuring out patterns is one of the things that gets your brain to give you a nice dose of chemical reward, the little ping of dopamine and whatever else that keeps you smiling. As a result, your brain is pretty good at finding patterns, and at disregarding information that doesn’t fit. Which means it’s also pretty good at finding false patterns, and at confirmation bias, and a bunch of other things that can be fatal. Our brains are also really good at making us the center of a narrative, because it’s what we evolved for.

So maybe I was making things all about me, to a ridiculous level. And yet. If they hadn’t needed salvage operators—specifically Synarche salvage operators—then I came back, again, to the idea that they’d been trying to get their hands on me. Which was not the most reassuring of conclusions, though it certainly did reinforce all my cherished beliefs about the depth of my own importance.

I wanted to know and I didn’t want to know, and I was having doubts about everything from who I was to my most basic memories. I didn’t think she was telling the truth—not all of it, anyway. But she might be telling enough of the truth that I would have something to gain—self-knowledge or something else—from taking it up with her.

It occurred to me that if I opened my fox to Farweather, she’d have to open hers to me. Assuming, of course, that she had one—but I couldn’t imagine anybody getting around in civilized space without some kind of access to senso. How would you open doors, for that matter? Talk to systers? Sure, you might be a humanocentric bigot, but you still needed to be able to talk to other ships and stationmasters (Habren, anyone?) if you were going to have anything to do with civilized space at all—and at least some Freeporters patently did so.

It also occurred to me that the answers to just about everything I wanted to know were probably sitting right out there in the open, shelved neatly in Farweather’s machine memory. Assuming she had machine memory.

This was a terrible idea. No justification I came up with was going to change that.

And yet, it was an idea that I kept having.

* * *

“How can I believe what you’re telling me?” I asked, sitting down on my own mattress, bleary-eyed. I felt terrible: sweaty, complicated, as if my skin were borrowed and also itched abominably. Farweather, despite being chained to the stanchion by one ankle, managed to look cool and tidy, except where her hair was tangled and greasy. Mine was growing out long—by my standards, anyway—and had started forming a wooly puff that tended to get flattened on one side from sleeping under gravity, when I didn’t go climb into my access tube and float free and comfortable.

She said, “Of all the things I am, I’m not a liar, Haimey.”

“I’m still collecting data on that, thank you.”

This is a terrible idea.

What would Connla do?

He’d tell you it was a terrible idea.

And then?

And then he’d probably decide to do it. Just to see what happened.

The still, small voice of my conscience was starting to sound rather a lot like Singer.

Right, I told myself. Just to see what happens, then.

“What kind of safeguards can you offer me, if we’re trading machine memories?”

“I didn’t say anything about trading,” she answered, too quickly.
