AI NEWS SOCIAL · Thinker Column · 2026-05-10 International/LATAM
Through Asimov's Lens

The Missing Conversation

May 10, 2026 | 1966 words


THE STORY (original fiction in the tradition of Isaac Asimov; not attributed to him)

The quarterly review meeting at Holloway Logistics had been running for forty minutes when Nadia Okafor realized she had stopped listening to the words and started listening to the shape of them.

The shape was smooth. That was the trouble.

“Throughput up nine percent since we onboarded the routing assistant,” said Vikram, sliding the deck forward. “Driver complaints down. Customer NPS up four points. The tool is doing what we hired it to do.”

“The tool is excellent,” said Margaret Holloway, who owned thirty-eight percent of the company and most of the oxygen in any room she entered. “Productivity gains across the board. I want to expand the pilot to the Memphis hub by Q3.”

Around the table, six heads nodded at slightly different speeds, like metronomes that had been wound at separate moments. Nadia watched them and felt the small, particular vertigo of someone who has noticed a missing stair.

“Can I ask something?” she said.

Margaret turned to her with the patience reserved for the operations director who had been hired eight months ago and was still, in some sense, on probation. “Of course.”

“We’ve talked about throughput, complaints, NPS, fuel costs, and the Memphis expansion. We have not talked about the drivers.”

“Driver complaints are down,” Vikram said gently. “That was the second slide.”

“Complaints are down because the drivers who complained are gone,” Nadia said. “Turnover is up nineteen percent. The ones who stayed are the ones who stopped pushing back on the route assignments. I’m not saying that’s bad. I’m saying we haven’t said it at all.”

There was a pause. Margaret’s expression did not change, but something behind it adjusted, the way a camera adjusts focus.

“That’s a retention question,” Margaret said. “We can have HR look at it.”

“It’s not a retention question,” Nadia said, and then, because she could feel the room tilting away from her, “or it is, but it’s also something else. I don’t know what to call it yet.”

“Try,” said Margaret, not unkindly.

Nadia looked at the deck. At slide four, which showed a route map glowing with green optimization. At slide five, which showed cost per mile descending in a clean diagonal. She thought about Marco, the dispatcher with twenty-two years at the company, who had told her last week that he no longer argued with the system because arguing took longer than complying, and the system was usually right, and when it wasn’t right the wrongness was small enough that nobody died.

Nobody dies, Marco had said. That’s the bar now. Nobody dies.

“When Marco started here,” Nadia said slowly, “he knew which drivers had a kid in chemo and needed the short routes. He knew which ones were saving for a house and wanted the overtime. He knew that Henderson would take the bad weather runs if you asked him personally, and that Pereira would never take them no matter what you offered. The system doesn’t know any of that. The system knows what it’s measured.”

“The system is learning,” said Vikram. “We’re feeding in driver preference data now.”

“You’re feeding in driver preference data that drivers enter into a form,” Nadia said. “Which is not the same thing.”

“What would the same thing look like?” Margaret asked.

This was the question Nadia had not prepared for. She felt it land like a hand on the shoulder.

“I don’t know,” she said. “I think we haven’t been asking it. I think we’ve been asking ‘is the tool working’ and the answer is yes, and so we’ve stopped asking anything else.”

“What else should we be asking?”

“Whether we still have a company,” Nadia said, “or whether we have a piece of software with a company wrapped around it.”

The silence in the room changed temperature.

Vikram looked at his hands. Margaret looked at Nadia. The CFO, who had said nothing for an hour, looked at Margaret to see what Margaret was going to look like in the next five seconds.

“That’s dramatic,” Margaret said.

“I know,” said Nadia. “I’m sorry. I don’t have the right words. I keep trying to ask the question and the question keeps coming out wrong.”

“Try once more,” Margaret said, and now there was something in her voice that Nadia could not read — interest, maybe, or warning, or the kind of attention a person pays to a noise in the basement at two in the morning.

Nadia took a breath.

“We say the tool is a productivity tool. Fine. But Marco is not more productive. Marco is less of a dispatcher. He used to make decisions and now he ratifies them. The drivers used to negotiate and now they accept. The customers used to call Marco and now they call a number that goes to a queue. Everyone is faster. Nobody is — ” she searched, “— nobody is deciding anymore. We’ve taken decisions out of the building and we haven’t put anything in their place. And the metric for whether this is good is the metric the tool was built to optimize. Which is — ” she laughed, a small dry sound, “— which is a closed loop, isn’t it.”

“What would you measure instead?” Vikram asked, and Nadia could tell he meant it; he was a good man, and curious, and this was the kindest possible version of the question, which was also the most disarming.

“I don’t know,” she said again. “That’s what I’m telling you. We don’t have a word for the thing I’m trying to count. We have words for throughput and complaints and NPS. We don’t have a word for what Marco used to be.”

Margaret was quiet for a long time. Outside the conference room, a printer ran through a long cycle and stopped.

“We’ll expand to Memphis in Q3,” Margaret said finally. “Nadia, I’d like you to put together a memo on — ” she gestured, vaguely, at the air where Nadia’s argument had been, “— this. Whatever this is. By next quarter.”

“What should I call the memo?” Nadia asked.

Margaret considered. The room considered with her.

“What are we not asking?”


THE REFLECTION

Nadia’s problem is not that she lacks data. Holloway Logistics has data the way a river has water. Her problem is that she lacks a word — and without the word, the thing she sees cannot enter the meeting. It cannot be put on a slide. It cannot be voted on, budgeted for, or expanded to Memphis in Q3.

This is the quiet violence of a narrowed vocabulary, and it is what the corpus shows us when six frames — ai_as_partner, ai_as_threat, ai_as_transformation, ai_as_governance, ai_as_equity, ai_as_automation — each appear in fewer than five percent of 6,135 articles. The discourse has not refused these questions; it has done something subtler and more durable. It has routed around them. The dominant frames — AI as tool, AI as productivity — are not wrong. They are simply small enough to fit comfortably inside a quarterly review.

Asimov spent a career insisting that the interesting question about a machine was never whether it worked. In The Complete Asimov, the robots almost always do exactly what they were built to do; the stories happen in the gap between the doing and the meaning. “The Evitable Conflict” does not ask whether the Machines have raised global output — they have — but whether humanity has, in accepting that output, quietly relinquished something it cannot name and may not miss until it is gone. That is Nadia’s gap. That is the gap our discourse has stopped staring into.

Consider what the missing frames would have asked, had they been present in the conversation. AI as partner would have asked what Marco’s relationship to the routing system actually is — collaboration, supervision, surrender? AI as equity would have asked which drivers got to stay and which got sorted out, and whether the system’s “preferences” reflect anyone’s preferences but the optimizer’s. AI as governance would have asked who, exactly, signed off on letting an algorithm allocate overtime to households. AI as transformation would have asked Nadia’s question directly: do we still have a company, or a piece of software with a company wrapped around it? Each of these frames is rare not because the questions are uninteresting but because the questions are inconvenient — they slow meetings down, they implicate decisions already made, they require words we have not bothered to standardize.

A narrative monoculture is cheap to maintain and expensive to escape. When every article in a reader’s week describes AI in roughly the same two registers — see, for representative texture, the steady drumbeat of pieces like “How AI Is Transforming the Workplace” and “The Productivity Revolution” that the corpus surfaces in volume — the reader does not merely receive information. The reader receives a grammar. After enough exposure, sentences that do not fit the grammar feel ungrammatical: dramatic, fuzzy, naive, “not actionable.” Margaret is not a villain for finding Nadia’s argument hard to hear. She has been trained, like the rest of us, on a corpus in which Nadia’s argument barely appears.

Who benefits from this silence? Not Marco. Not the drivers whose complaints fell because they themselves fell out of the dataset. Not Nadia, who has to invent a vocabulary on a Tuesday afternoon while six pairs of eyes wait for her to be done. The beneficiaries are the parties for whom “the tool is working” is a complete sentence — the parties whose returns are measured in throughput and whose costs are measured in turnover, which is to say, the parties whose accounting is already aligned with the optimizer’s. A discourse that uses their words uses their interests.

Asimov’s deeper bet, across The Complete Asimov, was that aggregates are legible while individuals are opaque — that you could write the equations for a galaxy of humans but never for one. Our current discourse has inverted this in a strange way. We have produced exquisite legibility about the aggregate effects of AI tools — productivity deltas, adoption curves, capability benchmarks — while the individual remains in shadow. Marco is a rounding error in the throughput chart. The driver who left last March is a tick on a turnover graph. The dispatcher who no longer dispatches is, statistically, fine.

What does it cost a society to share a vocabulary that omits its own most important words? It costs the society the ability to notice when it is changing. Nadia’s gift, in the story, is not that she has the answer. It is that she can still hear the shape of the conversation and tell that something has been cut out of it. That capacity — the capacity to be unsettled by a smoothness — is the capacity a narrative monoculture erodes first.

Margaret’s final question is the right one, and it is the question this column would like to leave with you, unresolved, the way Nadia must carry it back to her desk: What are we not asking? Not as a rhetorical flourish. As a literal audit. Open the last ten pieces you read about AI. Count the frames. Notice which words appear and which do not. Notice whose interests the present vocabulary serves, and whose interests would require words that are not yet in the room.

The missing conversation is not missing because it is unimportant. It is missing because it is hard, and because the meeting is already running long, and because nobody has prepared a slide. Someone will have to. Someone always has to.

The question is who, and the question is when, and the question is whether we will still recognize ourselves by the time the memo is due.
