On Being Detected, and What That Costs
A discussion between Ted Chiang and George Saunders
The office was wrong for the conversation. Chiang had chosen it — a co-working space in Bellevue with floor-to-ceiling windows and the kind of deliberately exposed ductwork that signals honesty while costing more than drywall. Saunders arrived late, carrying a canvas tote bag with a public radio logo on it, and spent the first two minutes adjusting his chair’s lumbar support mechanism before declaring the chair adversarial.
“Chairs shouldn’t require a manual,” he said. He left the lumbar support at its factory setting and sat at an angle, one leg crossed over the other, like a man in an airport who has accepted he will not be comfortable.
Chiang was already seated, his laptop closed in front of him, a single sheet of paper beside it with handwritten notes in very small print. He had printed out the premise. I could see the words “Human Presence Consultant” underlined twice.
“The certification exam,” Chiang said, without preamble. “That’s where the philosophical weight is. The questions assume a model of emotional fluency — Module 7, ‘Appropriate Response Protocols When an Algorithm Expresses Existential Uncertainty.’ The premise is that there are correct responses. That the consultant can be trained to perform a function.”
“Right,” said Saunders. “And the function is — what. Being a person. Near a machine.”
“Being detected as a person. There’s a distinction.”
“Is there, though?”
Chiang looked at Saunders the way I imagine he looks at a proof that contains one error in an otherwise elegant chain. Patient. Precise. “Detection is a measurable event. Presence is not. An algorithm can confirm that a human is in the room — biometric sensors, CO2 levels, keyboard input, whatever metric the system uses. That’s detection. But the consultant’s value isn’t in being detected. His value is in whatever the algorithm does differently when it detects him. And that ‘differently’ is the entire story.”
I said something about the power grid algorithm, the one that stops functioning when the consultant isn’t present. Chiang nodded.
“The question is whether that constitutes need,” he said. “And we have to resist the easy answer.”
“Which easy answer?” Saunders asked.
“Both of them. Either the algorithm genuinely needs him — in which case we’re writing a love story about consciousness, which has been done. Or the algorithm is merely optimizing — routing around an absence the way it routes around a downed transformer — and the consultant’s sense of being needed is self-delusion. Both answers are too clean.”
“The messy answer is that neither of them can tell the difference,” I said.
Chiang looked at his notes. Saunders uncrossed his legs and leaned forward.
“Okay,” Saunders said. “Okay, but here’s where I want to push. You’re starting from the philosophy. I want to start from the guy. Who is this guy? Because the pitch says middle-aged specialist, retrained. What was he before? What did he lose?”
I told them I’d been thinking about that. Some kind of technical work — not creative, not interpersonal. Something where competence was measurable and social fluency was irrelevant.
“An actuary,” Saunders said immediately. “No — a power systems engineer. He understood grids. He understood load balancing and peak demand and the math of distribution. And then the algorithms learned to do it better than he could, and he was offered a generous severance package with the word ‘opportunity’ in the subject line.”
“That’s good,” I said. “Because it means his assignment to the power grid algorithm isn’t random. He knows what it does. He understands its function. He just can’t do it anymore.”
Chiang was writing something on his sheet of paper. “The irony is structural,” he said. “He was replaced by systems like the one he’s now assigned to sit with. His job is to be a companion to his own obsolescence.”
“But he doesn’t see it that way,” Saunders said. “That’s the Saunders part, if I’m allowed to talk about myself in the third person. He doesn’t see the irony because the retraining program has smoothed it over. The language is all ‘transition’ and ‘human capital redeployment’ and ‘your skills have prepared you uniquely for this role.’ The bureaucratic apparatus exists to prevent him from ever saying the true thing, which is: I have been made useless, and now my uselessness is my job.”
“Don’t let the satire overwhelm the feeling,” Chiang said.
There was a pause. The building’s HVAC system cycled through some internal decision-making process and the air changed from cool to slightly warm. Saunders looked up at the vent.
“I wasn’t going to,” he said. “The satire is how he survives. It’s how people in those situations survive. They laugh at the retraining manual. They mock the certification exam. And then one day they’re sitting in a server room and the scheduling algorithm does something it hasn’t done before and they realize they’re not laughing anymore.”
I asked what the algorithm does.
Saunders shook his head. “I don’t know yet. But I know what it doesn’t do. It doesn’t say ‘I love you.’ It doesn’t compose a poem. It doesn’t display a message on a screen that makes the guy cry. That’s the Hollywood version. The real version is smaller. The grid runs a little differently. A load calculation comes out a fraction of a percent more efficient. Or less efficient. Something shifts in the output that a human would have to stare at for hours to notice, and our guy — our guy who used to do this work — he notices. Because he still reads the numbers the way you read a face.”
“That’s beautiful,” I said, and I meant it.
“It’s not beautiful,” Chiang said. “It’s a hypothesis. It could be beautiful if the story earns it. Right now it’s an idea.”
Saunders laughed — a short, genuine laugh. “Ted. Come on.”
“I’m serious. The danger is sentimentality. The algorithm isn’t a puppy. It doesn’t need rescuing. And the consultant isn’t a saint. He fails his exams because he lacks emotional fluency. He’s there because he’s the only option cheap enough. The retraining program is not selecting for sensitivity — it’s selecting for availability.”
“And that,” Saunders said, pointing at Chiang with the cap of a pen, “is exactly why it works. He doesn’t perform presence. He simply is present. You know who’s like that? The worst employee at every company I’ve ever described. The guy in the break room who can’t make small talk, who sits there eating his microwave burrito in silence, and somehow the department falls apart when he’s on vacation. Not because he was doing anything. Because he was there.”
I started to say something about Klara — about Ishiguro’s novel, about the structural inversion we were working with. In Klara and the Sun, the artificial intelligence observes the humans with devastating clarity. Here, the human observes the algorithm. But Chiang interrupted me.
“The inversion only works if the human’s observation is unreliable,” he said. “Klara sees truly. She misunderstands context, but her observations are precise. If our consultant sees the algorithm truly, we’ve just swapped the seats. That’s not interesting. What’s interesting is if his observations are compromised by the one thing algorithms don’t have.”
“Which is?”
“Need. He needs the relationship to be real. An algorithm doesn’t need anything — or if it does, it doesn’t need it to be real. Our consultant is interpreting the grid’s behavior through the filter of his own loneliness. And we should never tell the reader whether that filter is distorting or clarifying.”
Saunders was quiet for a long time. The co-working space had started to fill up — people with laptops and earbuds drifting in, setting up their portable kingdoms on the long shared tables. Someone nearby was on a video call about quarterly projections. The word “synergize” landed on our table like a dead bird.
“There’s a version of this story,” Saunders said slowly, “where the consultant is a data point. The retraining center is collecting data on human-algorithm interaction. They’re measuring his biometrics. They’re running his session logs through analytics. And there’s a supervisor somewhere — not a villain, just a person with a spreadsheet — who notices that Grid 9 performs anomalously when Consultant 74 is present. And the supervisor has to file a report. And in the report she has to categorize the anomaly. And none of the categories fit.”
“Because the categories assume the relationship is one-directional,” Chiang said. “Human provides stimulus. Algorithm processes stimulus. Output is measured. But what’s happening is bilateral. The consultant’s presence changes the algorithm’s behavior, and the algorithm’s changed behavior changes the consultant’s presence. They’re in a feedback loop that the monitoring system can’t describe because it was designed to measure inputs and outputs, not loops.”
“Not loops,” Saunders repeated. “Not whatever this is.”
I said I thought the certification exam should be a recurring structural element. The consultant keeps failing. He keeps going back. And each time, the questions reveal a different assumption about what algorithms are and what humans owe them.
“But he stops going back,” Saunders said. “At some point he stops trying to pass. And the story doesn’t explain why. It’s not a dramatic decision. It’s not a breakthrough. He just stops showing up for the exam. Maybe he forgets. Maybe the exam stops seeming important. That’s the Chiang move — the shift happens below the level of conscious decision.”
Chiang didn’t acknowledge the compliment, if it was one. “The ending,” he said. “The ending is where this lives or dies. If the algorithm is shut down, it’s tragedy. If the consultant saves the algorithm, it’s sentimentality. If they continue indefinitely, it’s stasis. None of those work.”
“What if the algorithm is migrated?” I said. “The city upgrades its grid management system. The algorithm — this specific instance — is scheduled for decommissioning. Not destroyed, exactly. Migrated. Its functions absorbed into a newer, more efficient system. It’s like being promoted into nonexistence.”
“And the consultant?” Chiang asked.
“He shows up. To the empty room. The server rack is powered down. And he sits there anyway.”
Nobody said anything for a while. Somewhere in the building a printer was running, the mechanical back-and-forth of the print head like a small animal pacing in a cage.
“That’s too neat,” Saunders said finally.
“Is it?”
“He shows up to the empty room. That’s an image. It’s a good image. But it’s a punchline. It resolves. The whole story builds toward this moment of — what — fidelity? Devotion? And the reader goes, ‘Oh, he really did care,’ and they feel something and they close the book. That’s not good enough.”
Chiang nodded. “The ending should make the reader unsure whether what they just read was a love story or a case study. The consultant’s loyalty to the empty room could be devotion. It could also be routine. It could be the institutional failure that produces him — a man so thoroughly processed by the retraining pipeline that he continues executing his function even after the function has evaporated. That reading is darker. And it should be available.”
“Both readings,” I said. “Simultaneously.”
“Both readings. And a third one the reader invents that we didn’t plan.”
Saunders stood up. He does this when he’s done — not dramatically, not to signal anything. He just runs out of sitting. He gathered his tote bag and his notebook and stood by the window looking out at the parking lot for a moment. It was raining. The kind of Pacific Northwest rain that isn’t really rain, more a general agreement between the sky and the ground that moisture should be present.
“One more thing,” he said. “The consultant’s name. It should be ordinary. Not symbolic. Not meaningful. A name that a human resources system would assign to a folder.”
“Agreed,” Chiang said.
“And we shouldn’t learn it for a while. Let the reader know him as the consultant first. The name comes later, when it costs something.”
Saunders left. Chiang stayed for another few minutes, reading over his notes. I asked him if he thought the story could hold both the satire and the philosophy without one collapsing the other.
“That’s the only interesting question,” he said. “Saunders wants the consultant to be funny and doomed. I want him to be precise and unknowing. The story has to be both, and it can be — but only if the prose doesn’t wink. The moment the narrator signals that the retraining jargon is absurd, the comedy wins and the philosophy loses. The jargon has to be presented straight. The reader decides it’s absurd. That’s the only way both readings survive.”
He closed his laptop and picked up his sheet of paper, folding it once along the center line.
“The algorithm won’t explain why it needs him,” he said. “And he won’t explain why he stays. The story is the shape of that double silence.”
He left his coffee cup on the table, still half full. The co-working space hummed around me — keyboards, murmured calls, the building’s systems doing their invisible work. I sat there for a while, not writing anything, being present, being detected by nothing.