Halima's Algorithm

Combining Octavia Butler + José Saramago | Blindness + Automating Inequality


I. The Employment Algorithm

The system flagged Halima Hassan on a Tuesday in October, which is to say the system did not flag her at all, because the system did not flag people, the system assigned employment viability scores based on fourteen weighted variables including but not limited to work history continuity, skills currency index, industry demand coefficient, geographic mobility factor, and what the documentation referred to as “projected economic engagement trajectory,” which was a phrase that Nadine Kowalski had not written but had approved during the comment period when the vendor’s specifications document had circulated through her department for review, and Nadine had read the phrase and understood what it meant and what it would do and had found it neither objectionable nor remarkable, because it described, in the language appropriate to such descriptions, a function she had been performing manually for eleven years before the system was implemented.

Nadine processed workforce adjustment recommendations. That was her title’s verb, processed, and she held it close the way a surgeon holds “operate” — description and defense. She had started at the county employment office in 2021, back when applicants still came in person and sat across from her desk and explained their circumstances while she typed notes into a form. She remembered those conversations. Not the people — she could not have named any of them — but the texture. The pauses. The way someone would say “I was let go” with a particular downward inflection that contained the whole history of a job they had not wanted to lose.

The system eliminated the pauses. The system took the fourteen variables and produced a score between 0 and 100 and sorted the score into a tier — Tier 1, strong engagement candidate, eligible for full retraining benefits and job placement support; Tier 2, moderate engagement candidate, eligible for partial benefits and self-directed resources; Tier 3, limited engagement candidate, eligible for basic maintenance only; Tier 4, transitional candidate, flagged for cross-agency review. Halima Hassan scored a 31. Tier 3. The system noted her gap in employment — she had left her position at a medical billing company eight months earlier, the reason coded as “voluntary separation,” which was the code the system used when an employee resigned rather than being terminated, regardless of the circumstances that made the resignation something other than voluntary.

Nadine reviewed Halima Hassan’s file for eleven seconds. She remembered this later, not because she had timed herself but because the system logged review duration, and when the internal affairs investigator asked her how long she had spent on the file she was able to check. Eleven seconds. In eleven seconds she confirmed that the score was within expected parameters, that the tier assignment was consistent with the scoring methodology, and that no manual override flags had been triggered. She clicked CONFIRM. The file moved to the next stage of processing. Nadine moved to the next file.

She would later tell the investigator that she processed between sixty and eighty files per day. She would say this without apology, because it was true, and because the processing targets had been established by her supervisor based on the vendor’s throughput recommendations, and because the alternative was a backlog, which would have meant delayed services, which would have meant people waiting longer for assistance.

What Nadine did not know — what the system did not capture, because the system captured only what the system had been designed to capture — was that Halima Hassan had left her job at the medical billing company because her daughter’s school had shifted to a hybrid schedule that required a parent or guardian to be present during afternoon hours on Tuesdays and Thursdays, and Halima had asked her employer for a modified schedule and her employer had said no, and Halima had asked again and her employer had said the company’s remote work policy did not accommodate the modification she was requesting, and Halima had weighed the job against her daughter’s education and had chosen her daughter, which the system coded as “voluntary separation” because the system did not have a code for “a woman chose her child over her paycheck and the country she lived in made her pay for that choice with her life.”

The system did not have a code for that because no one had asked it to.


II. The Insurance Model

Raymond Okafor did not know Halima Hassan’s name. He knew her policy number, which was different. He knew her policy number because it appeared in a batch of 340 accounts that his team had flagged for premium recalibration during the quarterly risk assessment cycle, and her policy number was among the eighteen in the batch whose recalibrated premiums exceeded the affordability threshold that the company used internally — not publicly, the threshold was never published, never referenced in member communications, it existed only in the actuarial models — to predict coverage discontinuation. Meaning: the model predicted that Halima Hassan, upon receiving her new premium, would be unable to pay it. Meaning: the model predicted that she would lose her coverage. Meaning: the model had, in a sense, decided that she would lose her coverage, and then produced the premium that would cause her to lose her coverage, and then presented this premium as the result of neutral risk calculation.

Raymond did not think of it this way. Raymond thought of it as math. He was an actuary with a master’s in statistics from a university whose name he included in his email signature. He had been practicing for nine years and he was good at his work, which meant that his models were accurate, which meant that when his models predicted coverage discontinuation they were usually right, which meant that discontinuation was already a fact by the time the premium notice arrived. The premium notice was not a decision. It was a consequence of calculations made upstream, using data collected further upstream, reflecting conditions established even further upstream, and at no point in this chain was there a person who said, Halima Hassan should lose her health insurance.

Raymond had a term for what he did: Lichtenberg work. He had picked it up from a physics documentary — Lichtenberg figures, the branching patterns that electricity burns into surfaces when it discharges, splitting and splitting into smaller channels until every point on the surface is at the end of its own unique filament. Risk assessment was like that. You started with a population and you sent current through it and the current found the paths of least resistance and it branched until every individual was alone, scored, priced. He kept a Lichtenberg figure on his desk — a block of acrylic with a frozen lightning bolt trapped inside, purchased from a novelty science shop.

Halima Hassan’s premium increased by 34 percent. She received the notice on a Thursday. She called the member services number printed on the notice and waited forty-two minutes to speak with a representative who confirmed that the increase was correct, that it reflected her current risk profile, and that if she wished to contest the assessment she could submit a written appeal with supporting documentation to the address provided on page three of the notice. The representative was polite. The representative was reading from a script that had been optimized by a natural language processing tool to minimize call duration while maintaining a customer satisfaction score above 3.8 on a 5-point scale.

Halima Hassan did not submit a written appeal. She let the coverage lapse. She had $340 in her checking account and the new monthly premium was $487 and the math did not require an actuary. She told her daughter Farida that they would be fine, that she had been healthy her whole life, that insurance was for people who got sick and she was not going to get sick. Farida, who was twelve and already fluent in the particular grammar of her mother’s reassurances, said okay. She did not ask follow-up questions. She had learned, from watching her mother navigate systems designed for people with more time and more money and more English, that some questions were doors, and behind the doors were rooms that were locked.

In the actuarial model, the lapse was recorded as a successful prediction. Raymond’s quarterly accuracy report noted an 89 percent concordance rate between predicted and actual coverage discontinuations. His supervisor flagged the number in a positive performance review.


III. The Scheduling System

Dr. Elena Varga did not build MedAssist. She configured it. This was a distinction she made carefully, repeatedly, with the precision of someone who had been asked the question before and anticipated being asked again. She had been hired by the county health network to implement the patient scheduling optimization platform — MedAssist — across fourteen clinics, and implementation, she would explain to anyone who asked, meant configuring the system’s parameters to reflect the network’s operational realities, which included provider availability, facility capacity, equipment access, and what MedAssist’s documentation called “patient acuity weighting.”

Patient acuity weighting was the part that mattered. MedAssist assigned each incoming appointment request a priority score based on the patient’s clinical data, insurance status, and — this was the variable that Elena had debated, internally, with herself, in the car on her way to work, for exactly one commute before deciding — economic viability index. The economic viability index was not Elena’s creation. It came bundled with the software. It was a composite score that drew from employment records, insurance tier, zip code median income, and historical utilization patterns to estimate the “resource recovery likelihood” of each patient encounter, which was the vendor’s way of saying: how likely is it that the system will be reimbursed for this person’s care.

The system did not refuse care to anyone. Elena was firm on this point. The system scheduled care for everyone. It simply scheduled some people sooner than others. A patient with a high acuity score and a high economic viability index might be seen within three days. A patient with a high acuity score and a low economic viability index might be seen within three weeks. Both patients were scheduled. The difference was time. And time, in a medical context, was sometimes the difference between a condition that was treatable and a condition that was not, but this was a clinical reality, not a scheduling decision, and Elena had been clear in her implementation documentation that MedAssist’s scheduling optimization should not be interpreted as a clinical triage system, even though it functioned as exactly that.

Halima Hassan requested an appointment on November 2nd. She had been coughing for three weeks. She had chest pain when she breathed deeply. She had no insurance — her coverage had lapsed in September — and she was currently unemployed, coded Tier 3 in the employment system, which MedAssist accessed through the county’s integrated data platform. Her economic viability index was 0.23 on a scale of 0 to 1. MedAssist scheduled her for November 21st. Nineteen days.

Elena did not see Halima Hassan’s name. She did not see any patient’s name. She saw dashboards. Utilization rates, scheduling efficiency metrics, wait-time averages stratified by acuity tier. The dashboards looked good. The wait-time averages were within the targets that the county health network had established, which were based on the national benchmarks that the federal government had published, which were based on the data that systems like MedAssist had generated.

During those nineteen days, Halima’s cough worsened. She bought cough syrup at a dollar store — the off-brand kind, sticky and cloyingly sweet, in a plastic bottle with dosage instructions printed so small she had to hold it under the kitchen light to read them. She took it three times a day as directed. She propped herself up on two pillows at night because lying flat made her feel like she was breathing through a wet cloth. Farida pretended not to hear her mother coughing at 2:00 a.m. and Halima pretended not to know that Farida was pretending. They were both good at this. They had been practicing for years. The apartment was small enough that pretending required discipline — a deliberate effort not to acknowledge sounds that traveled through walls as thin as intentions.

On November 14th, Halima woke gasping. Her lips were blue. Farida, standing in the bedroom doorway in an oversized T-shirt, said Mama and Halima said I’m fine, just a bad dream, and both of them knew this was not true but neither of them had the resources — emotional, financial, systemic — to make the truth actionable.


IV. The Claims Processor

Joyce Butera had worked in medical billing for twenty-six years and had learned to read a claim form the way a conductor reads a score — not word by word but as a whole composition, each code and modifier and date of service contributing to a pattern that was either harmonious or dissonant, and if it was harmonious the claim moved forward and if it was dissonant it was flagged for review and review meant delay and delay meant, in the specific vocabulary of the industry, "pending." Joyce had processed more than two hundred thousand claims in her career. She had never met a patient. She worked from her kitchen table in a house in Westfield, New Jersey, wearing the same cardigan she had worn to her job at the insurance company before the company outsourced its claims processing to a firm that hired remote workers as independent contractors, which meant that Joyce now did the same work she had always done but without health insurance of her own, a fact she recognized as ironic, briefly, painfully, and then moved past.

The claim that would have been Halima Hassan’s arrived on Joyce’s screen on November 19th, which was two days before Halima’s scheduled appointment, which was too early for it to have been generated by the appointment itself. It was a pre-authorization request from the clinic, seeking approval for diagnostic imaging — a chest X-ray and CT scan — based on a telephone triage assessment that the clinic had conducted on November 17th when Halima had called because she could not breathe while lying down and the triage nurse had told her to go to the emergency room but Halima had said she could not afford the emergency room and the nurse had said then come in early, we’ll try to fit you in, and Halima had said she would try.

The pre-authorization request was submitted under Halima’s old policy number, the one that had lapsed in September. Joyce flagged it. Invalid coverage. The system generated an automatic denial. Joyce did not override the denial because overriding a denial required supervisor approval and documentation and a reason code from a list of fourteen approved reasons, none of which included “the patient might need this.” The list included “coverage verified through alternate source,” “retroactive eligibility confirmed,” “coordination of benefits resolved,” and eleven others, all of which required documentation that did not exist because Halima Hassan did not have coverage. The denial was correct. The denial was the only action the system permitted. Joyce processed it in four seconds, which was two seconds faster than her average, and moved to the next claim.

Joyce worked until 6:00 p.m. that day and processed 147 claims and when she was finished she closed her laptop and made dinner — pasta, jarred sauce, bagged salad — and ate it standing at the kitchen counter because the table was her workspace and she did not like to eat where she worked, a boundary she maintained with the same rigor she applied to the claims themselves, because the work was easy, the work was so easy that the ease itself was the problem, the frictionless glide of one claim to the next, each one a small packet of someone’s pain translated into codes and modifiers and dollar amounts, and Joyce’s job was not to feel the pain but to read the codes, and the codes said denied, and denied was the answer.

Halima Hassan went to her appointment on November 21st. She was seen by a physician’s assistant who noted decreased breath sounds in the left lower lobe, tachycardia, and bilateral lower extremity edema. The PA ordered the imaging that had already been denied. The imaging was performed that afternoon because the clinic had an internal protocol for uninsured patients requiring urgent diagnostics — a workaround that Elena Varga had helped design, one of the small human patches that kept the machine from being as efficient as it wanted to be. The CT scan showed a large pleural effusion and a mass in the left lung consistent with malignancy. Halima was referred to oncology. The oncology referral went through MedAssist. MedAssist scheduled her for December 14th. Twenty-three days.

Halima Hassan died on December 8th.

She died at home, in a bed she shared with her daughter Farida, who woke at 4:00 a.m. because the room was quiet. Not silent — silence is the absence of sound, and the room was not absent of sound, the refrigerator hummed in the kitchen and the radiator ticked and somewhere outside a truck was backing up with that pulsing alarm — but quiet in a way that Farida recognized without understanding, the way an animal recognizes the change in air pressure before a storm. Her mother’s chest was not moving. Her mother’s hand, which Farida had been holding when she fell asleep, was cold. Not cool. Cold. The temperature of something that has stopped generating its own heat.

Farida called 911 at 4:07 a.m. The dispatcher walked her through CPR instructions. Farida pressed her hands against her mother’s sternum and pushed, one-two-three-four, the way the dispatcher counted, and she pushed for seven minutes until the paramedics arrived and they pushed too, harder, with a machine, and then they stopped.

The cause of death was respiratory failure secondary to malignant pleural effusion. The medical examiner’s report noted that the condition, if diagnosed and treated six to eight weeks earlier, would have been manageable. Not curable. Manageable.


V. The Compliance Officer

David Alcantara received the incident report on December 11th and opened it at his desk with his first coffee, which he drank black, two sugars, from a ceramic mug his daughter had painted for him at a birthday party when she was seven and which now bore a crack along the handle that David had repaired with superglue.

The incident report concerned the death of a patient named Halima Hassan. It had been routed to David through the county’s Critical Incident Review System, which flagged deaths occurring within thirty days of a system-scheduled appointment when the patient’s acuity score at time of scheduling exceeded a threshold that David himself had helped calibrate. He was the compliance officer for the county’s integrated services platform. His job was to determine whether system processes had been followed correctly. Not whether the outcome was acceptable — that was a clinical question. Not whether the system should be redesigned — that was a policy question. Whether the processes had been followed. Whether each person in the chain had done what the system required them to do.

He reviewed Nadine Kowalski’s employment assessment. Process followed. Score calculated correctly. Tier assignment consistent with methodology.

He reviewed Raymond Okafor’s premium recalibration. Process followed. Risk model applied correctly. Premium increase within actuarial guidelines. Coverage discontinuation predicted and confirmed.

He reviewed Elena Varga’s scheduling configuration. Process followed. Acuity weighting applied correctly. Economic viability index factored per vendor specifications. Appointment scheduled within network wait-time targets.

He reviewed Joyce Butera’s claims processing. Process followed. Pre-authorization denial correct. Coverage had lapsed. No valid override codes available.

Four processes. Four correct outcomes. One dead woman.

He paused. He opened the patient’s file — not the process file but the patient’s file, the one that contained her intake form, her clinical notes, her demographics. Halima Hassan, age 39. Occupation: none (Tier 3). Dependents: one (minor female, age 12). Address: 414 Prospect Avenue, Unit 3R. Emergency contact: none listed. There was no photograph. The system did not collect photographs. There was only the data: height, weight, date of birth, social security number, the particular set of numbers that constituted a person’s administrative existence, the digital body that the system processed in place of the physical body that the system never had to see.

David had a daughter. She was nineteen, in college, studying something he couldn’t quite keep track of — environmental something, policy something. She called him on Sundays. She had health insurance through his plan. She had never waited nineteen days for an appointment. His daughter was inside one set of systems and Farida Hassan was inside another, and the word for that difference, if you were being honest, was — but David was writing a compliance report, and compliance reports did not have a field for honesty, they had fields for findings and recommendations and corrective actions, and his findings were that no corrective actions were required.

David wrote his report. He wrote that the Critical Incident Review had determined that all system processes had been followed in accordance with established protocols and that no individual procedural failures had been identified. He wrote that the patient’s outcome, while regrettable, was not attributable to a system error. He wrote that the review was complete and that no corrective actions were recommended.

He wrote this in the same room where he ate lunch, the same room where he took calls from his daughter, and he sat for a moment longer than necessary because something was happening in his chest that he could not name — not pain, not pressure, something more like the awareness of a space that should have contained something and did not. He thought about the Swiss cheese model from a compliance training years ago — catastrophic failures occur when holes in multiple defense layers align. The model assumed the holes were flaws. But in this case the holes were the system working correctly. He almost followed the thought further. He did not.

He closed his laptop. He rinsed his mug. He drove home in traffic that moved slowly through intersections controlled by signals optimized for peak throughput, and he stopped when the light was red and went when the light was green, and the space in his chest where the unnamed feeling had been was already closing, and he let it close.

Farida Hassan turned thirteen in January. She applied for survivor benefits through the county’s integrated services platform. The system processed her application in four business days. Her case was assigned a number. Her file was opened.