THE END OF ALL THINGS

Not literally in the system; my consciousness hadn’t been sucked into a computer or anything. That’s stupid. My consciousness was in my brain, like it always was.

 

But before, my senses had been dropped into the bridge simulation. Everything I could see or sense was inside of it. For those few seconds when the simulator crashed, I was somewhere else. The system the simulator ran on.

 

I wasn’t seeing anything, and then the bridge simulation popped up again, which said to me that the bridge simulator crashing wasn’t entirely unheard of. Control (or whoever) had set up a restart routine to go directly back into the bridge simulator, without giving the pilot any time to figure out what was going on, or to see the computer interface he or she was working within.

 

But that didn’t necessarily mean the pilot was completely locked out of the system.

 

I launched the docking simulation again.

 

If Control knew the program crashed, then that meant it knew where the bugs were—or knew where some of them were. So either it knew where they were and did nothing about them other than relaunching the system directly back into the simulator, or it did something about it and tried to patch the code—and in the process possibly created new bugs when the new code interacted poorly with the old code.

 

Control wouldn’t know anything about the new bugs unless they glitched during a run it was watching. And no one would do what I just did while Control was watching because Control would probably electrocute them for farting around.

 

So: Control didn’t know that this glitch was there.

 

But some glitches are transient and not reproducible. Those are the hardest ones for a programmer to fix.

 

I ran the simulation exactly as I had before to see if the glitch would replicate in the same way.

 

It did.

 

So I ran it a third time.

 

And this time, when the program crashed, I thought about the commands that would open the diagnostics and modification screens when the system we’d programmed the bridge simulator on was booted up.

 

I thought about them really hard.

 

And two seconds later, there they were.

 

The diagnostics and modification screens. Ugly and utilitarian, just as they have been since the very beginning of visual user interfaces.

 

They were beautiful.

 

They meant that I was into the system.

 

More specifically, I was into the Chandler’s system.

 

Well, a little, anyway.

 

This would be the part of the story where, if this were a video piece, the heroic hacker would spew a couple of lines of magical code and everything would open up to him.

 

The bad news for me was that this was very much not my personal situation. I’m not a heroic hacker with magic code. I was a brain in a box.

 

But I am a programmer. Or was. And I knew the system. I knew the software.

 

And I had a plan. And a little bit of time before anyone was going to bother me again.

 

So I got to work.

 

* * *

 

I’m not going to bore you with the details of what I did. If you’re a programmer and you know the system and the hardware, and the code, then what I did would be really cool and endlessly fascinating and we could have a seminar about it, and about system security, and how any system fundamentally falls prey to the belief that all variables are accounted for, when in fact the only variables accounted for are the ones you know about, or, more accurately, the ones you think you know about.

 

The rest of you would have your eyes glaze over and pray for death.

 

I assume that’s most of you.

 

So for the rest of you, what you need to know:

 

First, the work, the first part of it anyway, took more than a single night.

 

It actually took a couple of weeks. And during all that time I waited for the moment when Control, or whoever, looked at the Chandler’s system and found evidence of me wandering around in it, making changes and trying to get into places where I shouldn’t. I waited for the moment they found it, and the moment they decided to punish me for it.

 

But they didn’t.

 

I’m not going to lie. Part of me was annoyed that they didn’t.

 

Because that’s some lax security. All of it was lax. When whoever it was took over the Chandler, they left the system wide open, with only a basic level of security that would have been outmoded right at the beginning of the computer era. Either they were so sure that they didn’t need to worry about security where they were—everyone could be trusted and no one would try to screw with things—or they were just idiots.

 

Maybe both! The level of insecurity was actually offensive.

 

But it worked to my advantage, and without it I would probably be dead, so I shouldn’t really complain.

 
