Watercolor painting of an overflowing office inbox with documents piling up, some fading to translucent
AI Transformation·4 min read

The Zombie Asset

What happens when your AI produces faster than your organization can think?


The Brief

This article examines what happens when AI-generated output floods an organization faster than humans can evaluate it. It introduces the concept of the zombie asset, content that is alive in the system but dead on arrival, and explains how overproduction erodes judgment and narrows thinking.


What is a zombie asset in AI-driven organizations?
A zombie asset is AI-generated content that is alive in the system but dead on arrival, a term coined by Hamilton Mann in a piece for IMD. It describes output that gets produced, duplicated, and counted as progress but is never meaningfully read or acted upon by its intended audience.
Does AI actually increase employee workload?
According to an Upwork Research Institute survey, 77% of employees reported that AI had increased their workload, not decreased it. The paradox arises because AI accelerates production without addressing whether the output is relevant, creating more content for humans to review, triage, and correct.
What is the AI productivity illusion?
Hamilton Mann at IMD describes the productivity illusion as the gap between what dashboards measure and what actually matters. Dashboards count documents produced and messages sent, not outcomes or whether recipients felt understood. A company that tripled email volume saw unsubscribes climb and response rates drop.
How does AI cause framing collapse in organizations?
When AI summarizes meetings or sales calls, it imposes its own categories on the conversation. Teams follow these frames by default, not because they agree but because questioning them requires effort the system no longer rewards. Alternative hypotheses narrow without anyone noticing.
What is the fix for AI overproduction?
The fix is not a better model, smarter prompt, or fancier dashboard. Organizations need someone asking what should stop. Deliberately producing less often does more for outcomes than producing faster. The problem is not a shortage of horsepower but a shortage of judgment about what to produce.

I was skimming a workforce survey when a number stopped me: 77 percent.

That's the share of employees in a recent survey who said AI had increased their workload.1 Not decreased. Increased. I read it twice because the whole premise of the conversation, the one happening in every boardroom and keynote and vendor pitch, assumes the opposite. AI makes things faster. Faster is better. What could go wrong?

A software company with 2,800 sales and marketing employees found out. They rolled out generative AI across the whole commercial operation. Within six weeks, email volume to prospects had tripled. The dashboards looked "heroic."

A desk overflowing with printed reports and glowing screens, some documents translucent and fading. More output. Less signal. The dashboard doesn't know the difference.

Then unsubscribes started climbing. Response rates dipped. Sales reps were spending more time skimming AI drafts than crafting relevance. Product marketing pushed explainers with every feature launch, none technically wrong, most unnecessary. A growing backlog of content that nobody read but everybody duplicated. Hamilton Mann, writing for IMD, calls this the productivity illusion.2

He has a better name for the output itself. The zombie asset. Content alive in the system, dead on arrival.

What struck me was what happened next. Sales reps copied AI-drafted proposals that had the right numbers but the wrong assumptions. The teams who finalize contracts spent evenings fixing proposals that promised things the company couldn't actually deliver. Legal added a review step. Brand added another. The average time to send anything went up. Work piled between handoffs.

And the dashboards still celebrated. They counted documents produced per person and messages sent. Not outcomes. Not whether anyone on the receiving end felt understood. When the company followed up with prospects to ask why they signed or walked away, a pattern emerged: people reported feeling "blanketed, not understood."

The Framing Collapse

But the overproduction wasn't the interesting part. When the AI summarized a sales call, it framed the conversation around pricing. So discussions gravitated there, even when the real issue was whether the client trusted them or whether it was the right match. Meeting notes mirrored the assistant's categories. Mann describes this as the tool setting the pace. That's what happened here. Humans followed, and the space for alternative hypotheses narrowed without anyone noticing.

A meeting room with a projection screen showing AI-generated charts, empty chairs pushed back at odd angles. The categories came from the machine. The conversation stayed inside them.

Most AI failures are obvious. The tool hallucinates, the output is wrong, somebody catches it. This was different. The organization could produce everything the AI generated. That was the problem. Nothing stopped the flood because the flood looked like productivity.

The Muscle Memory Problem

By quarter three, what Mann calls skill atrophy was showing up in practice. Managers noticed they were coaching less and curating more, triaging machine-made outputs to find the few that mattered. The quiet change: people began accepting the model's framing as default. Not because they agreed, but because questioning it required effort the system no longer rewarded.

I wonder how many VPs are forwarding dashboards full of zombie assets right now. The failed AI project announces itself. It breaks, it stalls, somebody writes a postmortem. The zombie asset does the opposite. It ships constantly. That's what makes it harder to see.

It's already in your inbox. It's been duplicated three times since this morning. Somebody counted it as progress.

The fix isn't a better model. It's not a smarter prompt or a fancier dashboard. If your AI is producing faster than your organization can think, what you actually need is someone asking what should stop. More horsepower was never the problem. More judgment was.


References

  1. Upwork Research Institute. (2024). "Research Shows AI Enthusiasm Doesn't Match Workforce Reality." Upwork

  2. Mann, H. (2026). "The AI Productivity Illusion." I by IMD
