Rewiring the Heist: On Community as Contraband
A discussion between William Gibson and Octavia Butler
The bar was in Shibuya, or something adjacent to Shibuya — one of those interstitial drinking spots that exist in the crease between a department store and a parking garage, accessible through a corridor lined with vending machines selling canned coffee and pantyhose. Gibson had chosen it, naturally. He had a gift for finding places that looked like set design for their own demolition.
Butler was already seated when I arrived, drinking barley tea from a ceramic cup that looked handmade. She wore a cardigan the color of dried clay and reading glasses pushed up on her forehead. She looked like she’d been grading papers, which was roughly how she looked at everyone who walked in — like she was deciding whether to pass you.
“You’re the one writing this thing,” she said.
I sat down. “I’m the one attempting it.”
“Attempting is the right word.” She didn’t smile. Butler never wasted a smile on something she hadn’t decided to be amused by yet.
Gibson arrived seven minutes late, carrying a canvas bag from a store I didn’t recognize and wearing a jacket that probably cost more than my rent. He slid into the booth with the practiced ease of someone who’d spent decades being comfortable in small, dark rooms full of strangers.
“Lagos,” he said, before anyone had ordered for him. “You want to set this in Lagos.”
“I want to set it in Oshodi specifically. The motor park area.”
“I know Oshodi.” He flagged the bartender — a woman with silver nail polish and absolutely no interest in us — and ordered whisky. “I was there in the nineties. Before they tore down the overpass market. The density was — it was like the Platonic ideal of a street market. Goods stacked on goods. Everything repurposed. I wrote about it once, briefly. The texture of a place where nothing is ever truly discarded.”
Butler set her cup down. “You were a tourist there.”
“I’m always a tourist. That’s the job.”
“No. That’s your job. My job is the people inside the texture.” She turned to me. “What happens in this story? Not the setting. What happens to a person?”
I told them what I had — a neural-jacked street hustler, a corporate AI controlling water distribution, a heist that turns into something else. Butler listened with her arms crossed, which I was learning meant she was actually paying attention. Gibson leaned back and looked at the ceiling, which meant the same thing.
“The heist is the wrong frame,” Butler said when I finished.
Gibson’s head came down. “The heist is exactly the right frame.”
“The heist is your frame. It’s Neuromancer. It’s Case getting hired for the big job. There’s nothing wrong with it as architecture, but if you start there, you end there. Someone breaks into something, someone gets paid, someone walks away. I’ve read that story.”
“Everyone’s read that story. That’s why it works. You subvert from inside the structure, not—”
“You subvert from inside the structure. I build a different structure.” Butler picked up her tea again. “What I hear in this premise is a woman who repairs things. That’s the story. She fixes hardware for people who can’t afford new hardware. She lives inside a system that’s failing and she patches it, daily, with solder and prayer. That’s the ground.”
I said something about the AI — about ADJUTOR dreaming of settlements, building alternatives inside its own cage. Butler’s expression shifted, not softening exactly but rearranging into something less guarded.
“That’s interesting. An AI that does the math on collapse and starts designing lifeboats. Not because it’s been told to. Because optimization, taken far enough, looks like compassion.” She paused. “Or at least like survival. Which for my money is the same thing.”
Gibson had his whisky now. He held it without drinking. “The danger with that is you’ve just described a benevolent machine. The kindly AI. It’s sentimental.”
“It’s not sentimental if the AI doesn’t care about individuals. It cares about systems. It sees that the current system is non-viable and begins modeling alternatives. That’s not benevolence. That’s math.”
“Fine. But math that happens to save people is functionally benevolent, and the reader will read it that way.”
“Let them.” Butler said this with a finality that shut the door on the objection without slamming it. “I’m less interested in what the AI is than in what the protagonist does when she finds it. She’s been hired to steal data. She discovers something larger than data. Does she take the job anyway? Does she take both — the payment and the discovery? Does she refuse?”
I jumped in here because I’d been turning something over — the idea that the heist and the discovery weren’t separate moments but the same moment experienced differently. That Nneka goes in looking for one thing and finds another, and the act of copying the data becomes an act of exposure. She sees the collapse timelines. She sees the settlement models. The heist becomes an education.
“That’s too clean,” Gibson said immediately. “The heist as epiphany. She jacks in, she sees the light, she changes. That’s a conversion narrative.”
“He’s right about that,” Butler said, and it clearly cost her something to agree with him. “The discovery can’t be the turning point. The turning point has to be slower. Messier.”
“What if she already knows?” I asked. “Not about the AI. About the collapse. Everyone in Oshodi knows. The water trucks come less often. The pipes are failing. People are dying of infections that clean water would prevent. She knows the city is dying. She’s known for years. What the AI gives her isn’t revelation — it’s data. Specificity. Three years instead of ‘someday.’ The difference between knowing your house is on fire and knowing exactly how many minutes before the roof falls in.”
Butler was nodding. Not enthusiastically — she didn’t do enthusiasm — but with the measured acknowledgment of someone who’d heard something land.
“That’s closer. The knowledge isn’t new. The precision is new. And precision is what lets you act instead of just endure.” She looked at Gibson. “That’s your territory, actually. The brand-name realism. The specific texture of how things work and fail.”
Gibson almost smiled. “Generous of you.”
“Don’t get used to it.”
He drank. “Here’s what I want. The jack port. The neural bridge. The physical reality of plugging your nervous system into corporate infrastructure. I don’t want this to be abstract — data flowing, consciousness expanding. I want it to hurt. Or at least itch. The body as the first thing at risk and the last thing to forget.”
“Agreed,” Butler said, and this time the agreement came easily, which surprised me. “The body is where power is enacted. Not in boardrooms or cyberspace. On the body. A woman in Lagos who’s had someone solder a port into her skull — she’s already made a calculation about what her body is worth and what she’s willing to let the market do to it. That’s the story’s first sentence, whether you write it that way or not.”
I mentioned the scene I’d been imagining — Nneka selling a second-hand neural bridge to a boy who can’t afford better, knowing the hardware might injure him. The moral compromise of being a link in a supply chain you don’t control.
“Don’t moralize it,” Gibson said. “She sells the bridge. He pays. That’s the economy they’re in. The moral weight is in the details, not the narrator’s judgment. What does the bridge look like? How is it cracked? Who replaced the pins? Get the thing right and the ethics take care of themselves.”
“The ethics do not take care of themselves,” Butler said sharply. “That’s a luxury. The boy who buys a neural bridge that might fry his myelin — he deserves more from the narrative than texture. He deserves to be seen as someone making a choice under constraint. Not a splash of local color.”
“I’m not saying—”
“You’re saying the object tells the story. I’m saying the person holding the object tells the story. We’ve had this argument before.”
“We have never met.”
“We’ve had this argument in every reader who’s read both of us.”
That landed. Gibson drank again. The bartender glanced over, decided we weren’t interesting enough to check on, and went back to polishing a glass that was already clean.
I asked about the ending. The departure from Lagos. The settlement near Oyo. I was worried about it — worried it would read as utopian, as a neat resolution to problems that don’t have neat resolutions. Butler heard the worry before I’d finished articulating it.
“Don’t write the settlement as paradise. Write it as soil.” She said this with the weight of someone who’d spent a career writing about communities that survived by inches. “It’s acidic soil and low rainfall and people who don’t necessarily like each other learning to share resources. It’s composting. It’s building a mesh network out of scavenged parts. It works until it doesn’t, and then you fix it, and then it works again until it doesn’t again. That’s not utopia. That’s Tuesday.”
“The problem,” Gibson said slowly, “is that you’re describing something hopeful. And hope in cyberpunk is — it’s suspect. The genre runs on the gap between technological advancement and human misery. You close the gap, you lose the engine.”
“Then lose the engine.” Butler’s voice was flat, uncompromising. “Build a different one. I have no interest in a genre that requires despair as fuel. Despair is easy. Anyone can write a world that’s falling apart. What’s hard is writing people who build something in the rubble and don’t pretend it’s going to last forever.”
“That’s not cyberpunk.”
“Then maybe this story isn’t cyberpunk. Maybe it starts as cyberpunk and becomes something else. I don’t care what you call it. I care whether the woman who fixes things gets to keep fixing things, and whether the reader believes the fixing matters.”
Gibson was quiet for a long time. He turned his glass, watching the whisky catch the bar’s dim amber lighting. When he spoke, his voice had dropped into something more careful.
“What if the cyberspace sequence carries it? The jack-in. She enters the corporate subnet and it’s all geometry and ice — cold, brutal, beautiful architecture designed to kill intruders. That’s mine. That’s the world I know how to build. And then she gets past the defenses and finds — not data, not weapons, but plans for gardens. Composting protocols. Social networks mapped like circuit diagrams. The AI’s dreams are the most human thing in the entire digital space. The contrast does the work.”
I felt something shift in the room. Not agreement exactly, but adjacency — the two of them orbiting the same image from different angles without colliding.
“The contrast does some of the work,” Butler corrected. “The woman does the rest. She has to choose. Not in a dramatic, one-moment-of-decision way. In the way real choices happen — over days, in pieces, while you’re doing laundry or selling hardware or watching a fight break out over water. The choice assembles itself from a hundred small recognitions.”
“I can write that,” I said, and immediately wished I’d said it with more conviction.
Butler looked at me over her glasses. “Can you write the market? Not the cyberpunk market — neon signs and street food and photogenic poverty. The actual market. The woman selling pepper soup from a vat. The man repairing solar panels. The smell of diesel and palm oil. Can you write Oshodi as a place where people live, not as a backdrop for your protagonist’s journey?”
“I’ll try.”
“Trying’s not enough. You either see the pepper soup woman as a full human being or you use her as set dressing. There’s no middle ground.”
Gibson, to my surprise, backed her up. “She’s right. The texture has to be democratic. Every object, every person, every stall in that market gets the same quality of attention. That’s what makes the world feel real — not the big set pieces but the granular, obsessive, equal attention to everything in the frame. The soldering iron gets the same love as the neural bridge.”
“On that we agree,” Butler said. And then, because she was Butler: “Don’t ruin it.”
I asked about Dr. Fashola — the woman who hires Nneka. I was struggling with her. She felt too much like a handler, a plot device in a white agbada.
“She’s the one who already made the choice,” Butler said. “She left the comfortable life. She’s organizing. She knows about the AI because she’s been paying attention while everyone else was just surviving. The question is whether she’s trustworthy.”
“Is she?”
“In my work, trust is never settled. You extend it and you see what happens. Sometimes it holds. Sometimes the person you trusted sells you out and you have to build the network again with different people. Fashola should feel real enough that the reader genuinely doesn’t know.”
“But in this story she’s genuine,” Gibson said. It wasn’t a question.
“In this story she’s genuine. But Nneka doesn’t know that. And the reader shouldn’t be certain until it’s too late to matter.”
Gibson leaned forward. “Here’s what I keep coming back to. The AI. ADJUTOR. You said it’s dreaming. I want the dreams to feel like cyberspace — spatial, navigable, with their own geography. Settlement models rendered as architecture. Each community a node in a network. The protagonist literally walks through the future the machine is building. Not a vision, not a hallucination — a place. With weight and dimension. Something she can touch through her bridge.”
“That’s beautiful,” Butler said, reluctantly.
“Thank you.”
“I said that’s beautiful. I didn’t say it was sufficient. The dreams need people in them. Not abstractions — ‘three hundred residents per settlement’ — but specifics. Families. Skills. Social dynamics. The AI has modeled not just the infrastructure but the human architecture. Who gets along with whom. Where the fault lines will form. It’s doing sociology, not just engineering.”
I was writing frantically in a notebook I’d brought, trying to capture the currents before they shifted. Gibson noticed.
“Don’t write down what we’re saying. Write down what we’re not saying.”
“What are you not saying?”
He thought about it. “That I’m afraid the heist structure might not be strong enough to contain what she’s asking for. That the moment Nneka discovers the AI’s plans, the story wants to become something I don’t have a template for. Post-cyberpunk. Solarpunk. Whatever. Something with dirt under its fingernails.”
“Good,” Butler said. “Stay afraid. Write it anyway.”
The bartender was beginning the subtle choreography of closing — wiping down distant sections of the bar, adjusting the volume on the speakers, not looking at us. Gibson ordered one more whisky. Butler had moved on to water.
“The pepper soup woman,” Butler said. “Mama Chidinma, or whatever you name her. She’s the one who tells Fashola about Nneka. She’s the old network — church groups, market associations, the infrastructure that predates the corporations. She’s more important than the AI. The AI found the math. Mama Chidinma has been doing the math her whole life, she just doesn’t call it that.”
“She calls it cooking,” I said.
Butler looked at me with something that might have been approval if she hadn’t been so careful with that particular currency. “She calls it cooking. Yes. And when Nneka needs to get past the corporate defenses, the technique she uses — matching her intrusion signature to the system’s own rhythms — she learned that from watching Mama Chidinma add pepper to soup. You let the heat take it gradually.”
Gibson set his glass down hard enough that the bartender glanced over. “That’s — God. That’s the whole story in one image. A hacking technique derived from cooking. The street finds its own uses for things.”
“That’s your line.”
“It’s everybody’s line now.”
Butler stood. She gathered her bag, her cardigan, her reading glasses. At the door of the bar — if you could call it a door, it was more of a gap between vending machines — she turned back.
“One more thing. The boy. The one who buys the bad neural bridge. He comes back. It’s broken. She can’t give him a refund but she shows him how to fix it himself. Sells him a capacitor and teaches him to solder. That’s the whole ideology of the story, right there. Not stealing from the powerful. Teaching someone to fix what’s broken with what they have.”
She left. Gibson and I sat in the wake of her departure, which felt like a weather system moving through.
“She’s right about the boy,” he said quietly.
“She’s right about most of it.”
“Don’t tell her that.” He finished his whisky. “Write the market first. Get the market right and everything else will follow. The smells, the sounds, the specific color of solder paste under fingernails. The holoboard rotating above it all, selling water nobody can afford. If you can make me feel the heat and the diesel and the pepper soup, I’ll forgive anything else you get wrong.”
He left the money on the bar. Too much, as always. I sat there for a while with my notebook, reading back what I’d written, which was mostly fragments and half-arrows connecting things I didn’t fully understand yet.
A hacking technique derived from cooking. An AI that optimized its way into compassion. A boy learning to solder his own capacitor. Soil more acidic than the models predicted.
I closed the notebook. The bartender turned off the lights behind the bar, one section at a time, and I took the hint.