The Unseen World

“Tell me what time it is,” Ada heard her say, before she disappeared from sight. And from the front of the house she heard a boy’s long low complaint, a male voice in protest.

Ada stood very still until she was certain that no further sightings of William would take place—not through the windows of the kitchen, nor the dining room; not through the window on the upstairs hallway, where she sometimes saw him walking to his bedroom at the front of the house. One by one the lights went out. Then she turned and walked back across the three yards of her neighbors, and watched the back of their houses, too, for signs of life. In her own backyard she paused before going inside. She thought of David at his desk. She thought of her own room, decorated with things he had given her, and of the chalkboard in the kitchen, the thousands of problems and formulas written and erased on its surface, and of the problem that now stood before her, the problem of information that she both wanted and did not want.

At last she entered her own home through the back door, making more noise than necessary, imagining David rushing toward her with a wristwatched arm extended. Tell me what time it is, Ada, she imagined him saying. But he said nothing—may not, in fact, have noticed that she had ever left. Or perhaps he had forgotten. As she suspected, David was still in his office, the door to it open now. From behind he looked smaller than usual, his shoulders hitched up toward his ears.

She walked toward him slowly and silently, and then stood in the doorframe, putting a hand on the wall next to it tentatively, as she had done over and over again throughout her life, wanting to say something to him, unsure of what it was. His back was toward her. He knew she was there.

She could see him typing, but the font was too small for her to read.

She waited for instruction, any kind of instruction.

“Go to bed, Ada,” he said finally, and she heard it in his voice: a kind of strained melancholy, the tight voice of a child resisting tears.

The primary research interest of the Steiner Lab was natural language processing. The ability of machines to interpret and produce human language had been a research interest of programmers and linguists since the earliest days of computing. Alan Turing, the British mathematician and computer scientist who worked as an Allied code-breaker during the Second World War, famously described a hypothetical benchmark that came to be known colloquially as the Turing Test. Machines will have achieved true intelligence, he posited, only when a computer (A) and a human (B) are indistinguishable to a human subject (C) over the course of a remote, written conversation with first A and then B in turn, or else two simultaneous-but-separate conversations. When the human subject (C) cannot determine with certainty which of the correspondents is the machine and which is the other human, a new era in computing, and perhaps civilization, will have begun. Or so said Turing—who was a particular hero of David’s. He kept a photograph of Turing, framed, on one of the office walls: a sort of patron saint of information, benevolently observing them all.

In the 1960s, the computer scientist Joseph Weizenbaum wrote a program that he called ELIZA, after the character in Pygmalion. The program played the role of psychologist, cannily interrogating anyone who engaged in typed dialogue with it about his or her past and family and troubles. The trick was that the program relied on clues and keywords provided by the human participant to formulate its lines of questioning, so that if the human happened to mention the word mother, ELIZA would respond, “Tell me more about your family.” Curse words would elicit an infuriatingly calm response: something along the lines of, You sound upset. Much like a human psychologist, ELIZA gave no answers—only posed opaque, inscrutable questions, one after another, until the human subject tired of the game.

The work of the Steiner Lab, in simple terms, was to create more and more sophisticated versions of this kind of language-acquisition software. This was David’s stated goal when the venerable former president of the Boston Institute of Technology, Robert Pearse, plucked a young, ambitious David straight from the Bit’s graduate school and bestowed upon him his own laboratory, going over the more conservative provost’s head to do so. This was the mission statement printed on the literature published by the Bit. The practical possibilities presented by a machine that could replicate human conversation, both in writing and, eventually, aloud, were intriguing and manifold: Customer service could be made more efficient. Knowledge could be imparted, languages taught. Companionship could be provided. In the event of a catastrophe, medical advice could be broadly and quickly distributed, logistical questions answered. The profitability and practicality of a conversant machine were what brought grant money into the Steiner Lab. As head of his laboratory, David, with reluctance, was trotted out at fund-raisers, taken to dinners. Always, he brought Ada along as his date. She sat at round tables, uncomfortable in one of several party dresses they had bought for these occasions, consuming canapés and chatting proficiently with the donors. Afterward David took her out for ice cream and howled with laughter at the antics of whoever had gotten the drunkest. President Pearse was happy with this arrangement. He was protective of the Steiner Lab, predisposed to getting for David whatever he wanted, to the chagrin of some of David’s peers. The federal government was interested in the practical future of artificial intelligence, and in those years funding was plentiful.

These applications of the software, however, were only a small part of what interested David, made him stay awake feverishly into the night, designing and testing programs. There was also the art of it, the philosophical questions that this software raised. The essential inquiry was thus: If a machine can convincingly imitate humanity—can persuade a human being of its kinship—then what makes it inhuman? What, after all, is human thought but a series of electrical impulses?

In the early years of Ada’s life, these questions were often posed to her by David, and the conversations that resulted occupied hours and hours of their time at dinner, on the T, on long drives. Collectively, these talks acted as a sort of philosophical framework for her existence. Sometimes, in her bed at night, Ada pondered the idea that she, in fact, was a machine—or that all humans were machines, programmed in utero by their DNA, the human body a sort of hardware that possessed within it preloaded, self-executing software. And what, she wondered, did this say about the nature of existence? And what did it say about predestination? Fate? God?