Two Systems of Parole
A discussion between Walter Mosley and William Gibson
We met at a diner in Oakland that Mosley had picked, a place on International Boulevard where the counter stools were bolted to the floor and the laminate had gone from wood-grain to abstract expressionism over forty years of elbows and spilled coffee. It was the kind of establishment that charged three dollars for a bottomless cup and expected you to leave when the cup was empty, or sooner. Gibson looked at the menu with the expression of a man reading source code in an unfamiliar language — not confused, exactly, but aware that the syntax carried assumptions he wasn’t party to.
“You eat here?” Gibson asked.
“I eat at places like this,” Mosley said. “Every city has one. The question is whether you see it.”
Gibson nodded. He ordered black coffee. Mosley ordered eggs and toast and a glass of orange juice, and when the juice came it was the color of a traffic cone, clearly from a can, and Mosley drank from it without comment. I ordered coffee too, because I didn’t want to be the person who ordered herbal tea at a diner on International Boulevard, and because I needed something to do with my hands. I was already sensing that this conversation was going to move in directions I hadn’t anticipated.
“So,” I said. “An ex-hacker. Paroled. Near-future Oakland. Trying to go straight.”
“Already wrong,” Mosley said. He said it gently, the way a teacher corrects a student who has gotten the equation right but misunderstood the variable. “You said trying to go straight. That’s the first mistake. You’re making the crime the deviation and the straight life the norm. For a man coming out of prison — especially a Black man, especially in a city that’s been reorganized around money he’ll never see — the straight life isn’t normal. It’s a hypothesis. Something other people told him was available.”
“You’re talking about Socrates Fortlow,” I said.
“I’m talking about every man I’ve ever written. Socrates is the one who said it plainest. A man gets out and the world tells him: here, be good. But the world hasn’t been good to him. The world was never good to him. So what does ‘good’ mean when the system offering it has a record worse than yours?”
Gibson stirred his coffee. He hadn’t touched it yet. “The system you’re describing — and I want to be precise about this — is it the criminal justice system, or is it something larger?”
“It’s the air,” Mosley said. “It’s the water. It’s the fact that when Socrates walks into a grocery store, someone follows him. It’s the parole officer who asks where he’s been and where he’s going and whether he’s been associating with known felons, as though the neighborhood itself weren’t a known felon by their definition. You can’t separate the justice system from the economy from the housing from the surveillance. They’re the same animal.”
“Then we agree,” Gibson said. He picked up his coffee, took a sip, and set it down with the care of someone handling a specimen. “Because what I’ve been writing about for forty years is precisely that convergence. The point at which corporate infrastructure and state infrastructure become indistinguishable. In my work it’s cyberspace, it’s the zaibatsus, it’s the sprawl as a single continuous system of extraction. But the principle is identical. The system isn’t broken. The system is performing as designed.”
“Except you write it cool,” Mosley said.
Gibson blinked. “Cool?”
“Your people move through your systems like water through pipe. They’re clever. They adapt. Case, in Neuromancer — he’s a cowboy. He jacks into the matrix and he runs. He’s good at it. That’s thrilling, I won’t deny it. But he doesn’t bleed the way a man bleeds who can’t run. A man on parole can’t jack out. There’s no exit. There’s no Zion to escape to. There’s the ankle monitor or there’s the cell.”
This landed. I could see it land. Gibson looked at the counter, then at the window, where International Boulevard was doing what International Boulevard does at ten in the morning — the taco trucks opening, the liquor store’s neon already lit, a man on a bicycle carrying what looked like a car battery in a milk crate.
“You’re right,” Gibson said. “And I’ll tell you why you’re right, which is that I’ve always been more interested in the interface than in the person at the keyboard. The console cowboy is romantic because the console is romantic — the idea that you can dissolve your body into data and move at the speed of thought. But that dissolution is a privilege. Socrates Fortlow’s body is never dissolved. His body is the record. His body is what the system reads.”
“Now you’re getting somewhere,” Mosley said.
I wanted to jump in. I had been turning over an image — a man sitting in a room, tracked by algorithms, the gig economy pinging his phone with offers that were really commands. But I held back because Mosley was leaning forward, and when Mosley leans forward, you listen.
“Here’s what I need you to understand,” Mosley said, and he was talking to both of us now. “The man we’re writing about doesn’t have the luxury of philosophy. He’s not going to sit and contemplate the nature of surveillance capitalism. He’s going to get a ping on his phone at four in the morning telling him there’s a delivery job in Emeryville, and he’s going to get out of bed and do it because his parole conditions require him to maintain employment, and the algorithm has already decided that this job counts as employment and that refusing it counts as a violation. The philosophy is in the getting out of bed. The philosophy is in the choice he makes at three-fifty-eight, in the dark, when the phone hasn’t buzzed yet but he knows it will.”
“That’s beautiful,” I said, and immediately regretted saying it, because Mosley looked at me with an expression that suggested beauty was not the point.
“It’s not beautiful. It’s a Tuesday.”
Gibson cleared his throat. “But you’re describing something I recognize, Walter. You’re describing a protocol. The four a.m. ping, the compliance requirement, the algorithmic assignment — that’s a protocol. It’s a set of instructions executed against a person. And what I’ve always argued is that the protocol itself is the crime. Not the theft, not the hack, not the con. The protocol. The system’s instructions.”
“So we agree on the system,” Mosley said. “Where I suspect we disagree is on the man inside it.”
“Tell me.”
“Your characters see the code. They read it. They hack it or they exploit it or they find the back door. My characters can’t see the code. They can feel it — they feel it in their bones, the way you feel weather — but they can’t read it. They don’t have the vocabulary. They have a different vocabulary. They have the vocabulary of the street, of the body, of the conversation with the neighbor who got evicted last month. And that vocabulary is just as precise. It’s just as technical. But it doesn’t translate into the language of the system, so the system treats it as noise.”
Gibson was quiet for a long time. Long enough that the waitress refilled his coffee without asking. Long enough that I started to wonder whether he was formulating a response or simply acknowledging that he’d been told something he needed to sit with.
“I want to push back,” Gibson said, finally. “Not because you’re wrong — I think you’re essentially correct — but because I think you’re drawing the line too cleanly. You’re saying my people see the code and your people feel the weather. But what if our man — this ex-hacker on parole — is both? What if he used to see the code? He was a hacker. He had the vocabulary. He could read the system. And then prison took it from him. Not because he forgot the skills, but because the system updated while he was inside. He comes out and the protocols have changed. The language has changed. He’s functionally illiterate in a system he used to be fluent in.”
I felt something shift in the room. Not agreement — something more uncomfortable. Recognition.
“That’s worse,” Mosley said. “That’s much worse. Because now he knows what he’s lost. A man who never had the code — he can build a life around not having it. He can find dignity in the concrete, in the physical, in the human. But a man who had the code and lost it? He’s in mourning for a capability. And the gig economy is going to exploit that mourning. It’s going to show him just enough of the system to remind him what he used to be, and then it’s going to use that memory to control him.”
“The dealer’s trick,” Gibson said. “Give them a taste.”
“Don’t compare it to dealing,” Mosley said, sharply. “That’s lazy. It’s not a taste. It’s a mirror. The algorithm shows him a reflection of his former self — competent, fast, able to navigate — and then it puts that reflection behind a paywall. You can be this again, it says. You can be smart again. Just do what we tell you.”
I finally spoke up. “What if he has a choice? Not between crime and virtue — you’ve both already demolished that binary — but between two kinds of compliance. The gig economy wants his obedience. But what if someone from his old life offers him a job? A real hack. Something that would use the skills he still has. And the choice isn’t moral — it’s ontological. Which version of himself does he want to be?”
Mosley shook his head. “You’re making it too clean. You’re giving him a choice between two doors. Real life doesn’t have doors. Real life has a hallway, and the hallway keeps going, and the lights are bad, and you can’t see what’s at the end, and someone behind you is saying move faster.”
“But there are still decisions,” Gibson said. “Even in the hallway. You can stop. You can turn around. You can feel the walls.”
“And all of those count as violations,” Mosley said.
Gibson almost smiled. “Yes. Exactly. All of them count as violations. That’s the protocol. The system’s genius is that it’s made every form of agency into a form of transgression. Moving is a violation. Stopping is a violation. The only non-violation is the precise action the algorithm has prescribed, at the precise moment it prescribes it.”
“That’s totalitarian,” I said.
“That’s Tuesday,” Mosley said. “That’s what I’ve been telling you. You keep reaching for the political vocabulary — totalitarian, surveillance, protocol — and those words aren’t wrong, but they’re coming from outside. From above. A man living this doesn’t call it totalitarian. He calls it bullshit. He calls it the same old thing with a new phone number.”
I sat with that. I wanted to argue, because the tech layer felt important to me — the algorithms, the data, the near-future texture that Gibson brings. But Mosley was insisting on something I couldn’t dismiss: that the technology was a costume, and underneath it was a body that had been controlled by systems long before anyone wrote a line of code.
“So what are we actually arguing about?” I asked.
“Interiority,” Gibson said, surprising me. “Walter wants the man’s interior life to be the engine. The moral reckoning, the daily negotiations with conscience, the conversations with neighbors — that’s the story. I want the exterior architecture to be visible. The data flows, the corporate structures, the way the city itself has become a system of behavioral nudges. And the argument is about which layer carries the weight.”
“Both,” I said.
“Don’t say both,” Mosley said. “Both is a coward’s answer. Both means you haven’t decided. Pick a shoulder to stand on.”
“Fine.” I took a breath. “Interiority. The man. His mornings, his conversations, his hands. The tech is weather. It’s there. He lives in it. But the camera is on his face.”
Gibson tilted his head. “I can work with that. If the tech is weather, then I want it to be specific weather. Not rain in general. The particular rain that falls on this block at this hour. The particular algorithm that pings this man’s phone. If you’re going to push the technology to the background, at least make the background textured enough that a reader could reach into it and pull out a circuit board.”
“That’s fair,” Mosley said. “I don’t want a man in a vacuum. I want a man in Oakland. And Oakland in the near future has a texture. I just don’t want the texture to be the point.”
“It won’t be the point,” Gibson said. “It’ll be the setting. But in my work, setting has teeth.”
“Then let me ask you something,” Mosley said, and he pushed his plate to the side, eggs half-finished, in the gesture of a man making room for a heavier subject. “Where are the neighbors? In your work — Neuromancer, the Sprawl, all of it — where are the people who live next door? Case has contacts. He has employers and enemies and a dead girlfriend who haunts him through AI. But does he have a neighbor who brings him soup when he’s sick? Does he have an old woman on the landing who remembers his mother?”
Gibson opened his mouth, closed it, then said: “No. And that’s deliberate. The architecture of the sprawl is designed to make those relationships impossible. The geography of late capitalism — the pods, the coffin hotels, the franchised spaces — it atomizes. It isolates by design. The absence of neighbors is the point.”
“The absence of neighbors is your limitation,” Mosley said. “Not the point. A man can be atomized by every force you’ve described — the gig economy, the surveillance, the algorithm — and still have a neighbor. Still have a woman down the hall who knocks on his door to ask if he has any sugar. Because people are stubborn. Community isn’t an option that gets disabled when the software updates. It’s a reflex. It’s biological.”
“Or it’s nostalgia,” Gibson said, and I could hear something careful in his voice, the reluctance of a man who knows he’s about to say something that might cost him. “I don’t mean that dismissively. But the community you describe — the block, the neighbor, the woman with the sugar — that’s a structure from a particular time and a particular set of economic conditions. When those conditions change, when the landlord is an LLC in Delaware and the building turns over every eighteen months, the neighbor becomes a stranger becomes a ghost. I’m not celebrating that. I’m documenting it.”
“And I’m refusing it,” Mosley said. “I’m documenting the refusal. Socrates Fortlow lives in a building where the landlord is absent and the pipes don’t work and the police don’t come unless they’re coming for someone. And he still knows his neighbors. He still sits on the stoop. He still has conversations that aren’t transactional. That’s not nostalgia. That’s resistance. And if we’re writing a man in near-future Oakland, I want his resistance to look like that. Not a hack. Not a manifesto. A conversation on the stoop.”
I thought about this. I thought about the man we were building — the ex-hacker, paroled, navigating algorithms and parole officers — and I tried to picture him on a stoop. Talking to someone. Not about the system. Not about surveillance. About whether the Warriors had a chance this year, or about the price of eggs at the corner store, or about nothing at all. And I realized that the nothing-at-all was maybe the most radical thing in the story. A man whose every movement is tracked, whose every hour is accounted for, choosing to spend thirty minutes doing nothing with another person.
“Bill?” Mosley said.
Gibson was turning his coffee cup in slow rotations. “I’ll grant you the stoop,” he said. “On one condition. The stoop is bugged. Not literally — I’m not asking for a microphone in the railing. But the phone in his pocket is listening. The app that tracks his gig hours is noting his idle time. The parole compliance software flags thirty minutes of non-productive behavior. The stoop exists, the conversation exists, the human connection is real — but it is observed. And the observation changes it. Not because he’s paranoid, but because the system’s memory is longer than his.”
“Fine,” Mosley said. “The stoop is observed. But the observation doesn’t stop him from sitting on it. That’s the difference between your work and mine. In your world, surveillance changes behavior. In mine, people sit on the stoop anyway.”
Mosley finished his orange juice. He set the glass down with a precision that communicated something — not anger, not satisfaction, something more provisional. “There’s one more thing,” he said. “The man’s crime. The original hack. What did he do?”
“Something corporate,” Gibson said immediately. “He breached a system. Extracted data. Not for money — for exposure. He thought he was doing the right thing.”
“And was he?”
“The data he exposed was real. The corruption was real. But the people who got hurt by the exposure weren’t the corporation. They were the workers at the bottom. The ones whose Social Security numbers were in the files. The ones who got their identities stolen by the people who downloaded the leak.”
Mosley stared at him. “Now that’s a wound.”
“That’s the wound,” Gibson said.
“Because he can’t go straight,” Mosley said, slowly. “Not because the system won’t let him. Because straight is what he was when he did the worst thing he ever did. He was being moral. He was being a whistleblower. And the morality produced harm. So now the system tells him: be good. And he’s standing there thinking, I was good. Look what good did.”
The waitress came by and asked if we needed anything else. Mosley asked for more toast. Gibson said no. I said no, but I wanted to say: wait. Give me a minute. Because something had opened up in the conversation that I wasn’t ready for — this idea that the man’s moral injury wasn’t from crime but from virtue, and that the surveillance state he lived in now was just the external form of an internal prison he’d built himself.
But I didn’t say that. I didn’t say it because Mosley was already buttering his toast, and Gibson was looking out the window at a delivery drone passing above the taco truck, and the moment had the quality of something that needed to be left alone — a wound you don’t touch because touching it would make it yours.
“The drone,” Gibson said, pointing with his coffee cup. “Does it bother you that those are real now?”
“Everything you wrote is real now,” Mosley said. “That’s your curse. You imagined a future and it showed up underdressed.”
Gibson laughed. It was the first time he’d laughed, and it sounded like a door opening in a house that had been closed a long time.
“The question,” I said, because I couldn’t not say it, “is whether our man looks up at the drone. Whether he sees it.”
“He sees it,” Mosley said.
“He doesn’t see it,” Gibson said, at the same time.
They looked at each other. Neither corrected himself.
That’s where we left it.