Authentic Behavior Detected


The flag came in at 9:47 on a Tuesday, which was when most flags came in, because the batch processor ran overnight and dumped its results into the queue between nine and ten. Adaeze had a system. Coffee first — not the office coffee, which tasted like someone had boiled a shoe — but the coffee she brought from home in a thermos, beans from a roaster in Durham that sourced from a farm in Enugu State, which her mother said was a scam because good coffee came from Ethiopia and what did Enugu know about coffee, but the beans were good, and the thermos kept them hot until noon, and the ritual of unscrewing the cap and pouring into the ceramic mug she’d taken from a hotel in Abuja was the only part of the morning that belonged to her.

The queue had forty-three items. Behavioral Consistency Anomalies, the system called them. BCAs. She’d helped name them, back when the project was still called Lighthouse and the team was four people in a conference room with a whiteboard covered in words like “signal” and “friction” and “trust surface.” Now it was called Verity Integrity Suite, because someone in marketing had decided that “lighthouse” implied the platform was surrounded by rocks, and the team was eleven people across three time zones, and the flags came in at forty to seventy per day, and Adaeze reviewed the top-priority ones before handing the rest to the junior analysts.

She opened the first case. Account: @SunflowerTeach_. Posts: elementary school crafts, positive affirmations, recipes using three ingredients. Behavioral profile: posted between 6 AM and 11 PM Eastern, vocabulary consistent with native English speaker, aged 28-45, sentiment consistently positive. The BCA trigger: the account’s engagement pattern matched a known bot cluster. Same posting cadence, same emoji distribution, same ratio of original posts to reshares. The teacher — if she was a teacher — was behaving exactly like a machine pretending to be a teacher.

Adaeze pulled up the account’s history. Six years of posts. Photos with metadata intact. A birthday party. A cat named Professor Whiskers. Either this was a real person who happened to post with the regularity of a bot, or it was the most elaborate synthetic persona Verity’s crawlers had ever surfaced.

She marked it for manual review and moved on.

The work was like this: pattern recognition at the boundary between human and artificial. The system measured behavioral consistency across dozens of axes — vocabulary, syntax, posting schedule, geolocation coherence, device fingerprint stability, engagement reciprocity — and when the pattern deviated beyond a threshold, the flag went up. Most flags were genuine: bot farms in Dhaka and Manila churning out fake engagement, coordinated inauthentic behavior masquerading as grassroots opinion. Some flags were errors: real people whose lives were inconsistent enough to confuse the model. A woman who posted in English during the day and Tagalog at night. A man whose phone GPS jumped between two cities because he drove a long-haul route. The machine expected coherence. People were incoherent.

Adaeze was good at this work. She’d helped train the model, back when it was a research project in the Stanford NLP lab where she’d done her master’s. She’d fed it thousands of labeled examples: this account is real, this account is synthetic, this account is a human operating multiple personas for legitimate purposes (journalists, abuse survivors, activists in restrictive countries). The model learned. It learned from her judgments, her instincts, her sense of what a real person’s digital footprint looked like. Somewhere in the weights and biases of the neural network, there was a ghost of Adaeze’s pattern-recognition, refracted through seventeen billion parameters.

She opened the seventh case and stopped.


The account was called Ada Pepper. The profile picture was a close-up of scotch bonnets on a cutting board, no face visible. The bio read: “Igbo girl lost in the American grocery aisle. Writing my way home one recipe at a time.” The account had 4,300 followers, posted two to three times a week, and had been active for twenty-two months.

Adaeze knew the account. She knew it because she’d made it.

Ada Pepper was hers — the food blog she wrote on her phone at night, after the workday drained out of her, when she changed into the old Ankara wrapper her sister had sent from Nsukka and sat cross-legged on the kitchen floor because the counter was too high and her back ached and the floor was where her mother had always prepared food, squatting with a wooden mortar between her knees, pounding yam until the sound carried through the whole house, thud and scrape and thud and scrape, a rhythm that Adaeze sometimes tapped against her own thigh without thinking.

The blog wasn’t secret, exactly. She hadn’t told anyone at Verity about it, but she hadn’t hidden it either. It just lived in a different world. Work-Adaeze was Adaeze Okonkwo, M.S. in computational linguistics, behavioral integrity analyst, author of a cited paper on adversarial persona detection. Blog-Adaeze was Ada Pepper, who wrote things like “The plantain at Golden Harvest on Capital Boulevard is an insult to the plantain’s mother and I will not be returning” and “If you refrigerate your tomatoes I have nothing further to say to you but I am saying it anyway at length” and once, after a bad week, a 2,000-word post about her grandmother’s egusi soup that made eleven strangers cry in the comments.

She looked at the flag report. The system had cross-referenced Ada Pepper’s behavioral profile against the known accounts in Verity’s user base and found a partial match — not to Adaeze’s personal account, which she barely used, but to her device fingerprint. Same phone model, same OS version, same approximate geolocation, same Wi-Fi network hash. The system hadn’t identified her by name. It had identified a device overlap between two accounts with radically different behavioral signatures and concluded: probable inauthentic coordination.

The confidence score was 0.87. Anything above 0.80 went to senior review. Adaeze was senior review.

She sat with her thermos of Enugu coffee and looked at the two profiles side by side. On the left: Adaeze Okonkwo’s work account. Posts about NLP conferences. A reshare of a Verity blog post about platform integrity. A photo of the office dog, captioned “Quarterly review buddy.” Vocabulary: formal, technical, measured. Sentence length: 14.2 words average. Sentiment: neutral to mildly positive. Emoji usage: 0.3 per post.

On the right: Ada Pepper. A photo of chin-chin cooling on a newspaper, the ink bleeding into the oil. A rant about the Whole Foods on Wade Avenue stocking five kinds of artisanal hot sauce and zero kinds of iru. A recipe for jollof rice that began, “First, understand that your jollof rice will never be as good as my mother’s, and neither will mine, and this is the tragedy of the diaspora, but we cook anyway.” Vocabulary: informal, vivid, Igbo-inflected. Sentence length: 22.7 words average. Sentiment: high variance, strongly positive and strongly negative, often in the same post. Emoji usage: 3.1 per post, predominantly fire and crying-laughing.

The system’s analysis was technically correct. These two behavioral profiles could not belong to the same person. The vocabulary divergence was in the 99th percentile. The sentiment patterns were anti-correlated. The syntactic complexity ratio — the system measured this, Adaeze had helped design the metric — showed a 340% difference in subordinate clause frequency. Whoever wrote the conference posts and whoever wrote the jollof rice manifesto were, by every measure the model used, two different humans.

She closed the case file without marking it. Opened it again. Closed it. Pushed her chair back from the desk and walked to the window.

The Verity campus — they called it a campus, though it was a single five-story building on a former tobacco warehouse lot in the Research Triangle — looked out over a parking lot, and beyond that a strip of pine trees, and beyond that the highway. Nothing about the view suggested the future. Nothing about any of it suggested the future, except the work itself, which was about deciding who was real.


In her mother’s hallway in Enugu, there was a framed postcard of a Benin Bronze — a queen mother’s head, sixteenth century, held in the British Museum. Her mother had bought the postcard at the museum gift shop in 1994, the year she spent a semester at the London School of Economics on a fellowship that was supposed to change everything and didn’t. She’d framed the postcard and hung it beside the front door so that every time she left the house she could look at it and remember, she said, that the British had the audacity to sell you a picture of the thing they stole from you and call it a souvenir.

Adaeze thought about that postcard now, standing at the window. The bronze head was in London. The postcard was in Enugu. The memory was in Raleigh, in the Verity building, behind Adaeze’s eyes. Everything displaced from its context.

Her phone buzzed. A message from her sister, Kainene, in the family group chat: a photo of their mother’s cat asleep on the ironing board. Kainene had captioned it in Igbo, which the group chat always was, except when their brother Obiora, who’d been in Toronto since 2019, forgot a word and dropped in English, which made Kainene furious and their mother philosophical.

Adaeze typed a response in Igbo. Deleted it. Typed it again in English. Deleted that too. She put the phone down.

Something about seeing the two profiles side by side had stuck in her, like a fish bone she couldn’t cough up. She’d always known she was two people online. Everyone was two people online. But seeing it quantified — the vocabulary divergence, the sentiment anti-correlation, the 340% subordinate clause gap — made it feel less like a natural duality and more like a diagnosis.

She went back to her desk and opened the Ada Pepper profile again. Scrolled through the posts. Here was the one about the Nigerian restaurant on Hillsborough Street that used frozen spinach instead of fresh ugu in their egusi and thought she wouldn’t notice. Here was the one about flying into Lagos and smelling the air through the gap in the jetway seal and crying before she’d cleared customs. Here was the one — she’d written it at two in the morning after three glasses of palm wine her cousin had shipped, illegally, in a mislabeled bottle — about the difference between American loneliness and Nigerian loneliness, which was that American loneliness was a room with no one in it and Nigerian loneliness was a room with twenty people in it who didn’t know the you that you’d become.

The writing was raw. That was the word: raw, like freshly cut wood, the grain exposed, the sap still visible. It was the way she thought when she wasn’t thinking about how she thought. When she wasn’t calibrating for an audience that expected professional distance, methodological rigor, the passive voice, the strategic deployment of uncertainty.

She pulled up the behavioral consistency model’s documentation — her own documentation, co-authored with Raj and Tomoko, revised fourteen times. Section 3.2: Assumption of Singular Identity. “The model assumes that each natural person maintains a single primary behavioral signature across platforms and accounts. Deviations from this signature beyond two standard deviations trigger review for potential coordinated inauthentic behavior.”

She’d written that sentence. She’d written it because it was true — most real people did maintain consistent behavioral signatures, and most deviations did indicate something artificial. But the assumption was built for a world where people were one thing. Where identity was stable, singular, verifiable. Where code-switching was an anomaly rather than a survival skill practiced by every person who’d ever had to translate themselves to be understood.


At lunch she ate in the cafeteria, which served five kinds of grain bowl and one kind of soup, which was always described as “seasonal” regardless of the season. She’d brought her own lunch — rice and stew in a Pyrex container — and she ate it cold because microwaving it would fill the cafeteria with the smell of crayfish and pepper, and the last time she’d done that, Kevin from Growth Analytics had said “Whoa, that smells INTENSE” with the capital-letter emphasis of someone who thought he was being complimentary, and she’d smiled the work smile, the one that was technically a smile in the way that the office coffee was technically coffee, and she’d eaten at her desk for two weeks afterward.

Today she ate the cold rice and stew and thought about a conversation she’d had with her mother three weeks ago. Her mother had asked when she was coming home. Not for a visit — they’d settled into an annual rhythm, Christmas week, always the same. Home as in coming back. Finishing. Her mother had said: “You’ve been there seven years. You’ve proved whatever you needed to prove. Come and do something here.”

Adaeze had said something about her work, her visa situation, the complexity of relocating. She hadn’t said: I don’t know if I can come back. Not because of logistics, but because the person who left Enugu at twenty-four and the person sitting in this Raleigh apartment at thirty-one were not continuous in the way her mother assumed. There were gaps. Not metaphorical gaps — actual gaps in the self, places where one version ended and another began and the seam was visible only from certain angles, like the edge of a contact lens.

She thought about a novel she’d read in grad school, about a Nigerian woman who’d built a life in America and gone back and found that going back was its own kind of leaving. She’d read the novel in two days, sitting in her studio apartment in Palo Alto, and when she’d finished she’d called Kainene and spoken Igbo for forty-five minutes about nothing, just to hear herself in that language, just to confirm the frequency was still there.

Her phone buzzed again. An automated Slack notification: four new BCAs in the queue, priority tier 2. She swiped it away.

The cafeteria had a wall of windows facing the pine trees. Afternoon light came through at a low angle, turning the grain bowls golden. At the next table, two junior analysts were arguing about the dead internet theory — whether, at some percentage of bot traffic, the internet stopped being a human communication medium and became a machine talking to itself, with humans as incidental noise.

“The threshold’s probably already passed,” one of them said. “If forty percent of Twitter is bots and sixty percent of Facebook engagement is automated, then most of the ‘people’ you interact with online aren’t people. They’re patterns.”

Adaeze put her fork down. She was thinking about the confidence score. 0.87. The system was 87% certain that Ada Pepper was not the same person as Adaeze Okonkwo. And the system — her system, built partly on her judgment, trained partly on her instincts — was not wrong. It was measuring exactly what it was designed to measure. The problem was not in the measurement. The problem was in the assumption that consistency meant authenticity, that a person who spoke two languages in two registers to two audiences was performing a deception rather than performing a life.


She went back to her desk and did something she knew she shouldn’t do. She ran her own behavioral profile through the full diagnostic suite — not just the BCA module but the deeper analysis, the one they used for high-priority cases, the one that mapped vocabulary, syntax, sentiment, temporal patterns, topic clustering, and what Raj called “the vibe check,” which was actually a latent space embedding that positioned each account in a 512-dimensional behavioral space.

Work-Adaeze and Ada Pepper appeared on the visualization as two points so far apart they could have been in different galaxies. The system showed a dotted line between them — the device-fingerprint link that had triggered the original flag — and beside the line, a confidence interval that pulsed red: 0.87, 0.89, 0.91. The more data the system ingested, the more certain it became that these were two separate entities.

She zoomed into Ada Pepper’s cluster. The system had auto-tagged her behavioral profile: “Cultural food commentary, diaspora perspective, West African linguistic markers, high emotional variance, authentic engagement indicators.” Authentic engagement indicators. The system thought Ada Pepper was authentic. It also thought Ada Pepper was a different person from the analyst reviewing her case. Both conclusions were internally consistent. Both were, in the system’s logic, correct.

She opened her own work profile. Auto-tags: “Technical professional, NLP/AI domain, low emotional variance, corporate communication patterns, native English proficiency.” Native English proficiency. She stared at that for a long time. Her English was native — she’d been schooled in English from age three, Enugu’s postcolonial inheritance, the language of opportunity that was also the language of erasure. But Ada Pepper’s English was also native, and it was a different native, inflected with Igbo syntax and rhythms, with proverbs translated literally so they sounded strange and true, with a relationship to the language that was less proficiency than argument.

The system classified one as professional and the other as cultural. What it meant was: one speaks the way the training data expects an American tech worker to speak. The other doesn’t.

She thought about the idea of an empathy test — a machine designed to detect empathy as a proxy for humanity. The concept had circulated through the AI ethics discourse for years. Verity’s model was the inverse. It didn’t test for empathy. It tested for consistency. A real person, the model assumed, had one behavioral fingerprint, one way of moving through digital space, one self that could be verified and validated and stamped authentic. The empathy was irrelevant. The consistency was everything.

But consistency was what machines were good at. Bots were perfectly consistent. A bot farm in Dhaka produced accounts with beautiful behavioral coherence — steady posting schedules, stable vocabulary, predictable sentiment curves. The inconsistencies that flagged them were accidental: a timezone slip, a language glitch, a pattern in their engagement that was too regular to be human. The model had learned that perfect consistency was suspicious. But it had also learned that radical inconsistency — the kind produced by a person living two lives in two languages — was equally suspicious.

The only thing the model trusted was moderate consistency. The messy middle. The behavioral range of someone who was always more or less the same person. Which excluded, by design, anyone who’d ever had to become someone else to survive.


At four o’clock she had a one-on-one with her manager, Lena, whose official title was Director of Trust and whose unofficial title, among the analysts, was the person who explained to executives why catching bots was harder than it sounded. Lena had a framed poster of the Turing test on her office wall, which she said was “aspirational,” and a succulent on her desk that was clearly artificial but that she watered anyway, which Adaeze had never mentioned and never would.

“How’s the queue?” Lena asked.

“Under control. Forty-three this morning. I’ve cleared twenty-eight.”

“Anything interesting?”

Adaeze hesitated. The hesitation lasted less than a second, but she felt it in her body, a tiny lurch in the diaphragm. She said: “One edge case. Device overlap between two accounts with divergent behavioral profiles. Probably a shared household device.”

“Confidence?”

“High eighties. But the profiles are genuinely different. Different communities, different registers. I think it’s a real person with multiple contexts.”

Lena nodded. “Dismiss it. We’ve talked about this — the model runs hot on multicultural users. It’s a known bias. We’ve got it in the fairness audit queue for Q3.”

“Q3 is six months away.”

“I know. Budget cycle.”

Adaeze left the office and walked to the bathroom and stood in front of the mirror. The face looking back at her was the work face. Hair in a low bun, neat, professional. She’d stopped relaxing it three years ago — a decision that had felt political and personal and exhausting all at once, because in America everything about her body was a statement whether she wanted it to be or not. At work she kept it pulled back. On the blog, in the one photo she’d posted of herself, her hands kneading dough, the edge of her hair was visible in the frame — loose, full, the way it grew.

Two faces. Same face. The mirror was unhelpful.

She went back to her desk and pulled up a case she’d flagged three months ago — a woman in Houston, Filipina, who ran two Instagram accounts. One was a polished lifestyle feed: clean interiors, motivational quotes in cursive, photos of her children in matching outfits. The other was in Tagalog, profane and funny, full of voice memos ranting about her mother-in-law and photos of the food she actually cooked, which bore no resemblance to the plated meals on the lifestyle feed. The system had flagged the second account as a probable sock puppet of a misinformation network. Adaeze had reviewed it, recognized what it was, dismissed the flag, and moved on.

She’d dismissed it because she understood it. Because she’d looked at the two accounts and seen not deception but translation — the same woman speaking two truths in two frequencies, neither more real than the other. She’d understood it instantly, without needing to run diagnostics, because it was the most ordinary thing in the world if you’d ever had to be two people to be one person.

But the system didn’t understand it. The system saw statistical anomaly. And three months from now, when the Q3 fairness audit finally happened — if it happened, if the budget held, if someone remembered to schedule it — the system would still be flagging every bilingual grandmother and every code-switching immigrant and every person whose digital life didn’t fit the model’s assumption of what a single human looked like. And Adaeze would still be reviewing those flags and dismissing them one by one, a human override on a machine bias that she’d helped build.

She thought about a phrase Raj used sometimes: “garbage in, gospel out.” The model treated its own outputs as revelation. Once a flag was raised, the burden of proof shifted — not legally, not formally, but in practice, in the way the analysts approached the case. A flagged account was guilty until reviewed. And the reviewer, even Adaeze, even knowing what she knew, had to fight the model’s confidence score to trust her own judgment. An 87% felt heavy. It felt like evidence.

She closed the Houston woman’s file and opened the next case in the queue.


That night she sat on her kitchen floor in the Ankara wrapper and opened the Ada Pepper account on her phone. She’d intended to write a post — she’d been drafting something about suya spice, about how the American attempts at it always left out the ground kuli-kuli and substituted paprika, which was like substituting a postcard for a place — but instead she scrolled through her own posts and read them as a stranger would. As the system would.

The voice was different. Not just different from her work voice — different from every other voice she used. Different from the voice she used with Kainene on the phone, which was fast and clipped and competitive. Different from the voice she used with her mother, which was softer, more patient, layered with the tenderness of knowing your mother was alone in a house that used to hold five people. Different from the voice she used in the group chat, which was abbreviated, emoji-heavy, reactive.

Ada Pepper’s voice was the one she used when she was alone. When there was no audience to calibrate for. When the only reader she was writing for was the version of herself that had never left Enugu, the one who still thought of American grocery stores as a kind of museum where you could touch the exhibits.

She started to write a post about the flag. She got as far as “Today I discovered that a machine I helped build thinks I am two people” before she deleted it, because it sounded like Work-Adaeze writing from Ada Pepper’s account, which was exactly the kind of behavioral inconsistency the system was designed to catch. She tried again: “This morning I found out that the AI at work has been reading my posts about plantain and decided I’m a bot. I am sitting on my kitchen floor eating leftover jollof with my hands and I have never been more human in my life. The jollof is cold and the pepper is still good and the AI can choke.”

She didn’t post that either. She put the phone face-down on the kitchen floor and listened to the apartment. The refrigerator hummed. A car passed outside, bass thudding through the floorboards. The apartment was a one-bedroom in a complex off Gorman Street, close enough to the NCSU campus that she could hear the marching band practice on Saturday mornings, and she’d signed the lease because the kitchen had gas burners and a window above the sink, two things her mother would have approved of, though her mother had never seen the apartment and probably never would.

The kitchen was where Ada Pepper lived. Not the blog — the blog was everywhere, on her phone, in the cloud, distributed, ironically enough, across Verity’s own servers — but the sensibility. The way she moved through the kitchen was different from how she moved through the rest of the apartment. In the kitchen she used her hands. She didn’t measure. She tasted and adjusted and tasted again, the way her mother did, the way her grandmother had done, a form of knowledge that lived in the body rather than in instructions. In the rest of the apartment she was organized, precise, the kind of person who made her bed with hospital corners and sorted her recycling by resin code. In the kitchen she was a different animal.

She picked up the phone again and opened the blog’s analytics page. Four thousand three hundred followers. Sixty-two comments on the most recent post — the one about the Whole Foods hot sauce situation. Average engagement rate: 4.7%, which was good, which meant real people were reading and responding, which meant the thing she wrote in her truest voice on her kitchen floor in her dead grandmother’s sitting position was reaching actual humans who cared about the temperature of tomatoes and the quality of iru and the particular heartbreak of eating jollof rice alone in a country that didn’t know what it was.

The system thought this was inauthentic. The system thought 4,300 humans connecting over pepper and loneliness was a behavioral anomaly.

The system wanted her to be one person. The system’s logic was clean and defensible and built on seventeen billion parameters and the accumulated judgment of thousands of labeled examples, some of which were hers. The system said: pick one. The professional or the food writer. The analyst or the Igbo girl on the kitchen floor. You cannot be both, because being both looks, to a machine trained on consistency, like deception.

But she was both. She’d always been both. And the question the system was asking — which one is real — was a question that could only be asked by something that had never had to be two things at once.


The next morning she came into the office early. The queue had thirty-seven new items. She poured her Enugu coffee and opened the first case and began to work.

The Ada Pepper flag was still there, blinking amber in the priority queue. She hadn’t dismissed it. She hadn’t escalated it. She’d left it in a state the system called “pending review,” which meant it would sit there for seventy-two hours and then auto-escalate to Lena, who would see the device fingerprint overlap and trace it back to Adaeze’s work phone and ask a question that Adaeze would have to answer in one voice or the other.

She had seventy-two hours. She didn’t know what she’d do with them. She opened the second case and began to review it with the careful, systematic attention of someone who was very good at deciding which patterns were real and which were manufactured, and who was beginning to suspect that the distinction didn’t hold, had never held, was a convenience built for a world where people stayed in one place and spoke one language and lived one life, a world that had never existed for anyone she knew.

At her desk, on her phone, a notification appeared. Ada Pepper had gained three new followers overnight. One of them had left a comment on the suya spice draft she’d accidentally published — the unfinished one, the one that cut off mid-sentence. The comment said: “Please finish this. I need to know about the kuli-kuli.”

Adaeze looked at the comment. She looked at the BCA flag. She looked at the confidence score: 0.91 now, climbing.

She picked up her thermos and drank the last of the coffee, which had gone cold but still tasted like the thing it was.