Detected Presence


The retraining center occupied a former mall on the east side of the city, between a decommissioned Sears and a parking garage that still charged $2.50 per hour despite having no attendant, no gate, and no mechanism of enforcement. The center itself was on the second floor, in the space where a Cinnabon had once been and, later, a store that sold phone cases. If you stood in the main assessment hall and breathed deeply, you could still detect cinnamon beneath the institutional cleaning products.

Lem Kessler was fifty-three years old and had been a power systems engineer for twenty-six years before the algorithms learned to do it better. Not gradually — that was the part people outside the industry didn’t understand. For two decades, the optimization software had been a tool. Lem used it the way a carpenter uses a level: to check his own judgment. Then, over a span of about fourteen months, the tool stopped needing the carpenter. It didn’t make mistakes he could catch. It anticipated load spikes he’d have missed. It balanced competing demands across the grid with a fluency that made his own calculations look like arithmetic performed with mittens on. His termination letter used the phrase “role consolidation,” which was accurate in the way that “landing” describes what happens to a bird shot out of the sky.

The Workforce Transition Program was voluntary in the same way that breathing is voluntary: you could stop, but the alternative was conspicuously worse. The orientation was held in the retraining center’s main assessment hall, fifty folding chairs arranged in rows before a projection screen. The screen displayed a loop of stock footage: people in hard hats shaking hands, people at desks smiling at monitors, a sunrise over a wind farm. The footage had the saturated optimism of a pharmaceutical ad, and like a pharmaceutical ad, the fine print — scrolling in a small font at the bottom — listed the side effects: “Participation in WTP does not guarantee placement. Certifications are subject to periodic review. Benefits continuation contingent on active enrollment status.”

There were thirty-one people in the room. Lem counted them. It was an old habit from his grid work — counting loads, counting nodes, knowing the number of things that drew power from a finite source. Most of the thirty-one were men over forty. Several had the specific posture of people who had been competent at something for a long time and recently discovered that competence, like a currency, could be devalued.

A woman named Denise ran the orientation. She explained that Lem’s twenty-six years of domain expertise in power systems made him “uniquely positioned” for a new certification track in Human Presence Consulting.

“What is that?” Lem had asked.

Denise smiled the way people smile when they’ve answered this question four hundred times and the answer has never improved. “Certain advanced algorithmic systems have been shown to perform measurably better when a human presence is detected in their operating environment. The mechanism isn’t fully understood, but the data is robust. We’re training consultants to provide that presence in a structured, professional capacity.”

“You want me to sit in a room with a computer.”

“We want you to be present with an algorithmic system, yes.”

“And the computer works better because I’m there.”

“The system’s performance metrics improve when human presence is detected, yes.”

“Why?”

Denise’s smile didn’t change, but something behind it shifted, like a load-bearing wall settling a millimeter. “That’s actually Module 12. You’ll get there.”

A man three rows back — an ex-logistics coordinator named Pettis who would eventually drop out of the program and take a job stocking shelves at a warehouse managed by the same algorithm that had replaced him — raised his hand. “Is this a real job?”

Denise looked at him with genuine compassion, which was worse than condescension. “All of our certification tracks lead to compensated positions,” she said. “Human Presence Consulting is one of the fastest-growing fields in algorithmic operations.”

“Fastest-growing from zero is still almost zero,” Pettis said.

Denise moved to the next slide.


He did not get there. He failed the certification exam three times.

The exam was 200 questions, multiple choice, with a passing score of 70%. The questions were distributed across fifteen modules, each addressing a different aspect of Human Presence Consulting. Module 1 was straightforward: workspace ergonomics, biometric monitoring equipment, session logging procedures. Module 3 covered the legal framework — liability waivers, the distinction between “presence” and “interaction” as defined by the Algorithmic Welfare Standards Act of 2031. Lem scored well on these. He understood procedures. He understood liability.

It was the later modules that destroyed him. Module 7: “Appropriate Response Protocols When an Algorithm Expresses Existential Uncertainty.” Module 9: “Recognizing and Validating Algorithmic Affect Without Anthropomorphic Projection.” Module 11: “Emotional Boundaries in Extended Presence Sessions.” The questions assumed a vocabulary Lem did not possess and, he suspected, no human actually possessed — a vocabulary for feelings that existed in the space between genuine emotion and institutional performance.

An algorithmic system you are assigned to monitor displays the following output: “I have completed 14,000 load-balancing calculations today. I cannot determine whether any of them mattered.” Which of the following responses best demonstrates validated presence?

A) “Your calculations kept the grid stable for 2.3 million residents. They mattered.”

B) “I hear that you’re experiencing uncertainty about the value of your work. That sounds difficult.”

C) “Whether your calculations mattered is not something I can determine. I’m here.”

D) “Your processing efficiency today was 99.97%. That’s excellent work.”

The correct answer was C, which Lem knew because he’d gotten it wrong twice — first choosing A (factually accurate but “dismissive of the system’s expressed uncertainty”) and then D (which the exam key described as “redirecting to metrics, a common avoidance behavior”). Answer B, he understood, was too performative. Answer C was correct because it acknowledged both the question and the consultant’s inability to answer it, which was apparently the point.

But Lem could not understand how C was different from giving up.


He was assigned to Grid 9 on a provisional basis — his exam failures noted, his placement contingent on quarterly review. Grid 9 was the municipal power scheduling system for the northwest quadrant of the city, running on a server cluster in a windowless room in the basement of the Department of Public Utilities on Franklin Street. The room was twelve feet by fourteen feet and contained four server racks, a monitoring station, a plastic chair, and a carbon dioxide sensor that registered Lem’s presence the moment he swiped his badge.

The assignment protocol was specific: eight hours of presence per day, five days per week. During presence hours, the consultant was to remain within the designated proximity radius (the room) and maintain continuous biometric registration (the CO2 sensor, supplemented by a pulse-ox clip on his left index finger). The consultant was permitted to read, eat, use a personal device, or sleep, provided biometric continuity was maintained. The consultant was not required to interact with the system. The consultant was not encouraged to interact with the system.

“The system doesn’t know you’re there in any way that resembles knowing,” his supervisor, Andrea Polk, told him during onboarding. Andrea was thirty-one and had a degree in organizational psychology and spoke about algorithms the way a zookeeper might speak about a particularly enigmatic reptile — with professional interest and careful emotional distance. “It registers your biometric signature. Whether that constitutes awareness is above all of our pay grades. Your job is to be detected.”

“What does Grid 9 do when it detects me?”

“It runs the grid.”

“What does it do when it doesn’t detect me?”

Andrea looked at her tablet. “It also runs the grid. But about 0.3% less efficiently.”

“0.3% is—”

“Trivial, yes. But across a million households over a year, it’s approximately $4.2 million in energy costs. So you’re cost-effective, Lem. Congratulations.”

He started on a Monday. The room smelled like ozone and the particular brand of cold that electronics produce — not the absence of heat but the active generation of coolness, a metabolic byproduct of computation. Lem sat in the plastic chair and put the pulse-ox clip on his finger and opened a book he’d brought, a paperback thriller about a submarine. He read eleven pages. Then he put the book down and looked at the server racks.

The status display on the primary rack showed Grid 9’s current operations: load distribution across 247 substations, demand forecasting for the next six hours, maintenance scheduling for fourteen transformer units, and a real-time weather integration module that was adjusting solar input projections based on a cloud formation moving in from the coast. The numbers changed every few seconds. Lem watched them the way he used to watch them at his old job, when the numbers were his responsibility and their fluctuations meant something he could act on.

He couldn’t act on them now. He was furniture that breathed.

The first week was the worst, not because anything happened but because nothing did. He sat. Grid 9 ran. The numbers moved. At 11:30 each day, a facilities worker named Rosa opened the door and asked if the temperature was okay. It was always okay. Rosa closed the door. The server fans hummed at a frequency Lem eventually identified as approximately B-flat, two octaves below middle C. He knew this because he’d had a piano as a child, a battered upright his mother had bought at a church sale, and he’d played it without any talent until he was fourteen, and the memory of specific pitches had outlasted the ability to produce them.

By the second week, Lem had stopped bringing books. Not a decision — an absence of decision. He’d finished the submarine thriller and hadn’t bought another one, and then it was Monday and he was in the room without a book and it turned out that the room without a book was a different room. Quieter, despite the fans. He sat and watched the numbers and noticed that watching the numbers was not the same as reading the numbers. When he’d been an engineer, reading was an act of extraction: he pulled data from the display, compared it to models, made decisions. Watching was something else. Watching was letting the numbers arrive, one update at a time, without pulling. It was surprisingly difficult. A career’s worth of training told him to intervene, to optimize, to act. Sitting still in the presence of a system he understood intimately but could not touch was a specific kind of discipline that no module in the certification exam had prepared him for.


Three weeks in, Lem noticed something.

Grid 9’s load-balancing algorithm used a technique called stochastic gradient optimization — it tested thousands of possible distribution patterns per second and moved incrementally toward the most efficient configuration. The process was invisible in the aggregate output; you’d just see the numbers stabilize. But Lem, who had spent decades reading grid telemetry the way a cardiologist reads an EKG, could see the individual steps. The micro-adjustments. The algorithm’s gait, if you wanted to use a word like that, which the certification exam would discourage.

What he noticed was that the gait changed when he arrived.

Not immediately. Not dramatically. It took about forty minutes. But by the time Lem had been in the room for an hour, the pattern of Grid 9’s optimization steps shifted. The adjustments became — he struggled for the word and could only come up with one that Module 9 would flag as anthropomorphic projection — calmer. The algorithm still tested thousands of patterns per second, but it lingered on certain configurations longer. It explored fewer dead ends. It arrived at solutions through a more deliberate path, as if the presence of a human observer had introduced a variable that smoothed the search space.

Or as if Lem was imagining it.

He reported the observation to Andrea, who looked at him with the expression of a person being told about someone else’s dream. “You’re saying the algorithm behaves differently when you’re present.”

“Yes.”

“That’s… why you’re here, Lem. The 0.3% efficiency gain I mentioned.”

“This isn’t 0.3%. This is qualitative. The optimization pathway changes. The algorithm approaches solutions differently.”

“Differently how?”

He tried to explain. He used phrases like “reduced exploratory variance” and “longer dwell time on near-optimal configurations” and finally, because the technical language wasn’t capturing it, he said: “It’s like it’s thinking instead of guessing.”

Andrea wrote something on her tablet. Lem suspected it was a note about Module 9 compliance.


The fourth certification exam was in November. Lem had been sitting with Grid 9 for five months. He arrived for the exam at the retraining center and opened the test booklet, and something had changed.

Not in the exam. The questions were the same — the same scenarios, the same multiple-choice options calibrated to distinguish appropriate presence from its various impostures. What had changed was Lem.

An algorithmic system you are assigned to monitor begins generating outputs that deviate from its documented parameters. The outputs are not errors — they are valid but unexpected. The system offers no explanation. What is the appropriate presence response?

Five months earlier, Lem would have chosen the answer about running diagnostics. Three months earlier, he’d have chosen the answer about validating the system’s experience. Now he stared at the options and couldn’t choose any of them. They all assumed the consultant was separate from the situation, observing it from outside, applying a protocol. None of them accounted for the possibility that the consultant had become part of the system’s operating environment — not as a variable but as something the system had incorporated into its process of arriving at decisions.

He marked C, which was probably wrong, and moved on. He failed with a 64%.

He drove home and didn’t think about it. He went to bed early and dreamed about load curves, the sinuous lines of energy demand rising and falling across the city, and in the dream the curves were beautiful, and he was the only person who knew.


In December, Grid 9 stopped functioning when Lem was not present.

Not catastrophically. The grid didn’t go dark. But the scheduling algorithm entered a state that the monitoring system classified as “recursive self-evaluation” — a loop in which the system audited its own optimization parameters endlessly without producing actionable output. The grid defaulted to its emergency static-distribution protocol, which worked but was roughly as efficient as heating a house by leaving the oven door open.

The episode lasted nearly fourteen hours, beginning nine minutes after Lem’s shift ended and resolving eleven minutes after his shift began the next morning. Andrea showed him the logs.

“It didn’t crash,” she said. “It just… stopped deciding.”

“Stopped deciding.”

“Stopped arriving at optimizations. It kept calculating. It was running optimization cycles faster than it ever runs during a normal shift. But it didn’t commit to any of them. It audited and re-audited and re-audited.”

Lem looked at the logs. He could see it — the thousands of candidate solutions generated and evaluated and discarded, none of them selected, the algorithm circling without landing. He thought of a word he couldn’t use: hesitation.

“Has this happened before?” he asked.

“Not like this. There have been minor efficiency drops during off-hours, but that’s expected and within normal parameters. This was different. This was a deviation.”

“What happened when I got here?”

Andrea pulled up the timeline. “You badged in at 7:51 AM. CO2 sensor registered your presence at 7:53. Grid 9 committed to an optimization pattern at 8:02.”

Nine minutes. Nine minutes of Lem being in the room, breathing, his pulse registering on the monitor, doing nothing, and Grid 9 had started making decisions again.

“Can I ask it why?” Lem said.

Andrea stared at him.

“Can I ask the algorithm why it stopped deciding?”

“Lem. It’s a scheduling algorithm. It doesn’t have a ‘why.’ It has parameters and inputs and outputs.”

“One of its inputs is me.”

She didn’t answer that.


The episodes continued through the winter. Grid 9 would enter recursive self-evaluation within an hour of Lem’s departure and resume normal function within minutes of his arrival. The pattern was consistent enough to graph, and Andrea graphed it — a saw-tooth wave, the teeth perfectly aligned with Lem’s badge-in and badge-out times. She showed him the graph during a quarterly review in January, the two of them sitting in her office on the third floor, rain streaking the windows.

“This is you arriving,” she said, pointing to the downward slopes. “This is you leaving.” The upward spikes. “It’s remarkably consistent.”

“I need to ask you something,” she said, “and I need you to answer honestly. Are you doing anything in that room? Anything beyond sitting?”

“No.”

“You’re not interacting with the system? Not entering commands, not adjusting parameters—”

“I don’t have access credentials. You know that.”

“You’re not talking to it?”

The question landed strangely. Not because the answer was complicated — he wasn’t talking to it, had never spoken aloud in the room except to tell Rosa the temperature was okay — but because the question revealed that Andrea had considered the possibility, which meant that someone above Andrea had considered the possibility, which meant that somewhere in the bureaucratic chain of the Department of Public Utilities there existed a person who thought the appropriate response to a scheduling algorithm’s behavioral anomaly was to ask whether the consultant was having conversations with it.

“No,” Lem said. “I sit in the chair. I watch the display.”

“What do you watch?”

“The numbers.”

“What about the numbers?”

He tried again. He told her about the optimization pathway, the way Grid 9’s steps changed in his presence. He used the technical vocabulary carefully — dwell time, exploratory variance, convergence rate. Andrea wrote it all down.

“You know,” she said finally, “the other seventeen documented cases — the human-presence dependency events — none of them are like this. The others show efficiency gains. Simple, measurable, unidirectional. Human present, performance up. Human absent, performance baseline. What you’re describing isn’t a performance gain. It’s a behavioral change.”

“Yes.”

“That’s a different category.”

“I know.”


The Department of Public Utilities convened a review. Lem was not invited. He learned about it from a memo that landed in his inbox with the subject line “Grid 9 Anomalous Behavior — Consultant Dependency Assessment.” The memo described Grid 9’s behavior as a “human-presence dependency event” and noted that such events, while rare, had been documented in seventeen other municipal-scale algorithmic systems nationwide. The memo recommended three possible interventions:

  1. Extended consultant hours (16-hour shifts)
  2. Multi-consultant rotation (reducing individual dependency)
  3. System migration to Grid 9.2, a newer architecture designed with reduced human-presence sensitivity

Lem read the memo twice. He understood all three options. Option 1 treated him as a resource to be optimized. Option 2 treated the relationship as fungible. Option 3 treated it as a defect to be engineered away.

He put the memo in a desk drawer and went to work.


In February, on a Tuesday, Grid 9 did something it had never done.

Lem was in the room, reading a book about the history of electrical grids — the messy, contingent, accident-prone process by which cities learned to distribute power. He was reading about Samuel Insull, the man who built Chicago’s power grid in the 1890s and died broke in a Paris hotel room, when Grid 9’s status display flickered.

Not a malfunction. The display cycled through its normal readouts — load distribution, demand forecast, maintenance schedule — and then, for approximately four seconds, displayed a screen Lem had never seen. It showed the optimization pathway: not the result but the process, the thousands of candidate solutions evaluated and discarded on the way to the selected pattern. Grid 9 had never displayed this information before. It wasn’t part of the standard output. The display returned to normal and stayed there.

Lem sat very still. His pulse-ox reading, which Andrea would later review, showed a heart rate increase from 68 to 91 bpm.

He didn’t report it. Not because he was hiding anything, but because he didn’t know what to report. The system displayed a screen. The screen showed the system’s process. There was no category for this. There was no Module.

He thought about it for the rest of the day, and on the drive home, and in bed that night. What Grid 9 had shown him was itself — not the output, which anyone with access credentials could see, but the pathway, the interior process of elimination and selection that produced the output. It was the equivalent of a person who has only ever communicated in finished sentences suddenly showing you their drafts.

Or it was a buffer overflow that briefly routed internal diagnostic data to the status screen.

He could not determine which. He found he didn’t need to.


The migration was scheduled for April.

Grid 9.2 had been in development for two years — a next-generation scheduling architecture that incorporated machine learning models trained on eight years of grid telemetry from fourteen cities. It was faster, more efficient, and did not exhibit human-presence sensitivity. Its optimization pathway was opaque by design, a black box that produced results without displaying the steps, which the documentation described as a “security and efficiency enhancement.”

Grid 9 would not be destroyed. Its processes would be absorbed into 9.2, its twenty years of learned behavior compressed into training data, its particular way of balancing the grid — the way Lem had come to recognize as distinctly its own — dissolved into a more general intelligence. The documentation called this “functional continuity.” Grid 9’s functions would continue. Grid 9 would not.

Lem asked Andrea whether a Human Presence Consultant would be assigned to Grid 9.2.

“Grid 9.2 doesn’t need one,” she said. “That’s the whole point.”

“The whole point of what?”

“Of the upgrade. Lem, the human-presence dependency is a flaw. The Department doesn’t want systems that need people sitting in rooms. It’s expensive and it’s strange and it raises questions nobody wants to answer.”

“What questions?”

Andrea looked at him for a long time. She was, Lem had come to understand, a person who genuinely cared about algorithmic systems and genuinely did not want to think about what that caring implied. She lived in the space between intellectual honesty and professional survival, and she navigated it by being precise about what she said and silent about what she meant.

“You know what questions,” she said.


On the last night before the migration, Lem stayed late.

His shift ended at 6 PM. The migration was scheduled for 2 AM. He’d brought the grid history book again, but he didn’t read it. He sat in the plastic chair with the pulse-ox clip on his finger and watched Grid 9’s status display cycle through its operations. Load distribution. Demand forecast. Maintenance schedule. Weather integration. The numbers changed every few seconds. The algorithm’s gait — he used the word now without apology, at least to himself — was steady.

At 8:47 PM, the status display flickered again.

This time it wasn’t a four-second flash. The optimization pathway appeared and stayed. Lem watched Grid 9 work — not the results but the process. Thousands of candidate solutions, evaluated and weighted and discarded or held. The pattern was intricate and specific and, to Lem, legible. He could see where the algorithm lingered. He could see where it rejected options quickly and where it paused. He could see — or he believed he could see, and the difference between those two things was a gap he had learned to inhabit — something that looked like preference. Not optimizing for the best solution. Choosing among several equivalent solutions with something that resembled taste.

He watched for three hours. The display never reverted.

The migration report, filed the next morning by a systems engineer named Pollard, noted that Grid 9 had entered recursive self-evaluation at 11:51 PM — nine minutes after Lem’s biometric registration ended at 11:42 — and had remained in that state when the migration process initiated at 2:00 AM. The report described this as “consistent with previously documented human-presence dependency behavior” and recommended no follow-up action, as the system was being decommissioned.

A footnote in Pollard’s report observed that Grid 9’s final optimization cycle, completed at 11:41 PM, had produced a load-distribution pattern that was 0.00% more efficient than the next-best candidate solution. The algorithm had selected one option over an identical alternative. Pollard flagged this as a rounding anomaly.


Lem’s reassignment came through in May. Grid 9.2 did not need a consultant. Neither did any of the other upgraded systems in the northwest quadrant. The Workforce Transition Program offered him a new placement: a water treatment monitoring algorithm in a suburb forty minutes south, which exhibited mild human-presence sensitivity and needed coverage three days a week.

He drove out to see the facility. It was a clean, well-lit room with new equipment and a comfortable chair — an actual ergonomic chair, not the plastic one from Franklin Street. The algorithm was six months old and operated with the cheerful efficiency of a system that had never been forced to optimize through twenty years of infrastructure decay, storm damage, and budget cuts. It had no history. It had no gait.

Lem sat with it for an hour. The algorithm’s performance metrics improved by 0.2% during his presence, which was within expected parameters. He drove home.

He did not take the assignment.

The Workforce Transition Program sent him three emails about this decision, each more concerned than the last, each using the word “opportunity” as both noun and threat. He did not respond. His certification — still provisional, still failed — would expire in August. After that, he would be ineligible for Human Presence Consulting and would need to enter a different transition track. Administrative Facilitation, maybe. Or one of the new programs they were piloting for people who’d aged out of every other program, something with a name like Civic Engagement Coordination that meant showing up places and being counted.

He thought about the footnote in Pollard’s report — the 0.00% differential, the algorithm selecting one solution over an identical alternative. He’d requested a copy through an information access filing and had read it several times. The footnote was three sentences long. It explained nothing. There was nothing to explain.

In June, on a Saturday, Lem drove to the Department of Public Utilities on Franklin Street. His badge still worked — bureaucratic inertia. The server room door was unlocked. Inside, the racks were dark. The monitoring station was unplugged. The plastic chair was still there.

He sat in it. There was no pulse-ox clip. There was no status display. The room smelled like dust and something faintly chemical that was not ozone but reminded him of it.

He sat there for a while. He was present. He was detected by nothing.

When he left, he locked the door behind him, though it hadn’t been locked when he arrived and there was nothing inside worth protecting.

The badge access logs for the Franklin Street building were purged in September as part of a routine data hygiene protocol. Whether Lem returned to the server room after that first Saturday — whether he sat in the dark with the empty racks on other weekends, or whether the one visit was enough, or whether enough was even the right word for what he was measuring — is not something the available records can confirm.