The $10 trillion question: will AI agents create more jobs than they eliminate?
If AI agents had an existential inclination — and thank god they don’t — their arrival would be best described as politely devastating.
They’re not staging dramatic layoffs, storming the office, or unplugging your ergonomically seated colleagues en masse. That’s not how this goes. The much-hyped “replacement” isn’t a firing squad — it’s an invisible erosion. A subtle unthreading of what we used to call work.
And that’s what makes this so dangerous.
The wrong scoreboard
"Will AI create more jobs than it eliminates?"
Great, another round of corporate numerology.
This question has become the $10 trillion soundbite for keynote speeches and LinkedIn threads written in PowerPoint voice. But it's also the wrong question.
Because it assumes this is a zero-sum math problem. Jobs in, jobs out. Punch the clock, calculate the delta. As though “jobs” are the atomic unit of economic optimism.
But technology doesn’t play that tidy game. It doesn’t eliminate jobs — it erodes meaning. It hollows out. It reshapes.
Take customer support. A decade ago, you hired scores of people in Mumbai or Manila to follow scripts and move tickets. Today, an AI agent can handle 60% of those tickets before a human even sees them. So what happens? The entry-level agents are gone. Middle tiers get squeezed. And the humans who remain aren’t doing the same job — they’re escalation specialists facing weirder, nastier edge cases.
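The triage-before-a-human pattern is simple enough to sketch. Everything below is hypothetical: the canned topics, the `triage` heuristic, and the queue names are illustrative stand-ins, not any vendor's API. A real agent would call a model; a keyword lookup plays that role here.

```python
# Hypothetical sketch of agent-first ticket triage: the agent resolves
# what it recognizes and escalates the rest to a human queue.
# The keyword lookup stands in for a real model call.

CANNED_ANSWERS = {
    "password reset": "Use the 'Forgot password' link on the login page.",
    "refund status": "Refunds post within 5-7 business days of approval.",
}

def triage(ticket: str) -> tuple[str, str]:
    """Return (queue, response), where queue is 'auto' or 'human'."""
    text = ticket.lower()
    for topic, answer in CANNED_ANSWERS.items():
        if topic in text:
            return ("auto", answer)   # routine: resolved before a human sees it
    return ("human", "escalated")     # weird edge case: a person takes over

queue, _ = triage("How do I do a password reset?")
```

Note what the humans in this sketch never see: the routine tickets. Their entire job is now the `"human"` branch, which is exactly the mutation described above.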
That's not automation. That's mutation.
The job didn't go away. It changed into something else. And if you’re not trained for that something else — congrats, you’re technically still employed in the fiction of an obsolete role.
This isn’t the Industrial Revolution (and we should stop pretending it is)
Tech apologists — usually the ones cashing equity checks — love to invoke historical parallels. “Remember the ATMs?” they say. “Bank branches grew!”
Sure. But that was task-level automation. You took one small part of a banker’s job and gave it to a machine. AI agents aren’t interested in tasks. They’re going after functions.
In law, it's not just typing contracts — it's reviewing evidence, summarizing legal risk, and shaping strategy. In software, it's not just autocomplete — it's end-to-end deployment. Prompt a good agent and it’ll spin up a functioning SaaS app from scratch while you’re microwaving soup.
This is not the tractor replacing the farmhand. It’s the entire agricultural board striking itself from the record.
And sure, new jobs will be created. “Prompt engineer” is everyone’s favorite techno-bauble. But look hard. That job? Highly technical, cognitively demanding, and already on its way to being eaten by the next generation of promptless models.
AI creates narrow, high-leverage roles — and most of them don't scale with headcount. You don’t need a hundred AI safety auditors. You need ten insanely good ones. Unlike the factory age, hiring more humans no longer means producing more value.
Now that’s disruption — the kind with nowhere to reassign displaced people.
The hidden religion of busyness
Here’s the part that business leaders aren’t ready for: the problem is not just that AI might destroy jobs.
It’s that it’s going to reveal how many of those jobs were performative nonsense to begin with.
Think about it. How much of your calendar is spent in meetings that don't matter, aligning on things that aren’t urgent, preparing reports that no one reads? There's a phrase for this: productivity theater. And most orgs worship at the altar.
I worked with a product team once that had six meetings a day, seventeen tracked metrics, and not a single meaningful delivery for months. They weren’t lazy — they were drowning in noise. The actual breakthrough came when they deleted half their backlog and canceled most of their ceremonies. What remained suddenly made sense.
AI agents don’t just accelerate that realization — they shine a brutal light on it. When an agent drafts the memo in two minutes that used to take your comms team four hours, you have to ask: why were you paying for four hours of that in the first place?
It’s like Marie Kondo with a CPU. "Does this workflow spark ROI?"
Leverage is shifting — and it's not evenly distributed
Sure, AI supercharges certain categories of work. Developers become architects. Analysts become strategic. Designers iterate faster than clients can give feedback. But it doesn’t uplift everyone. In fact, it bifurcates.
You don’t go from warehouse packer to AI product manager because a local library offers a “reskilling initiative.” This isn’t just about skill gaps — it’s about structural gaps in who gets access, mentorship, and margin to reinvent themselves.
A logistics job lost to a GPT-enabled supply chain optimizer doesn’t naturally flow into “new work.” It falls through the floor. We’re not building an economy with more jobs — we’re building a barbell. Dense clusters of high-tier, high-leverage roles... and a scattered mess of low-quality gig scraps beneath it.
In between? Not much. The middle is melting.
We’re not creating jobs — we’re subclassing humans
In software, subclassing means building specific behavior on top of general capabilities. The AI economy works the same way: you don’t hire a marketer — you fine-tune a GPT on your brand voice and clone templates for outreach sequences.
This is becoming the business model: don’t staff up — subclass down. Agents don’t just replace workers — they replace departments.
We used to scale by hiring more people. Now we scale by writing more wrappers on top of foundation models.
That changes everything. The human role moves from executor to orchestrator. Unless, of course, the system learns to orchestrate itself — and that’s coming faster than most corporate boards want to admit.
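The "subclass down" pattern is literal in code. A minimal sketch, assuming an invented `FoundationModel` stub in place of any real model API: the "marketing department" is just a thin subclass that pins a system prompt.

```python
# Hypothetical sketch: scaling by wrapping a foundation model rather
# than hiring. FoundationModel is a stand-in stub, not a real library.

class FoundationModel:
    """General capability: answers any prompt."""
    def complete(self, prompt: str) -> str:
        # A real implementation would call a hosted model here.
        return f"[model output for: {prompt}]"

class BrandMarketer(FoundationModel):
    """A 'department' subclassed from general capability."""
    SYSTEM = "Write in our brand voice: short, warm, no jargon."

    def outreach(self, lead: str) -> str:
        # The specialization is just a pinned prompt around complete().
        return self.complete(f"{self.SYSTEM}\nDraft outreach to {lead}.")

agent = BrandMarketer()
message = agent.outreach("Acme Corp")
```

The design point is the asymmetry: adding another "department" is one more subclass, not one more headcount line.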
The dignity cliff
Here’s the real quiet part most won't say out loud: this isn't just a productivity story. It's a cultural one. A human one.
Because it's not just about job elimination. It's about work erosion. About stripped meaning. About a generation being outpaced not by their failure, but by exponential systems they never had the chance to learn.
We’ve long equated having a job with having a place in society. If AI agents replace the function, but leave the people behind, we're not just creating unemployment. We're creating undeclared irrelevance. That hits deeper than a pink slip.
When you’re no longer the actor in the loop, not even overseeing the loop, but merely watching the loop run without you — that isn’t restructuring. That’s alienation.
Okay, so what do we do?
Let’s resist the binary thinking — that jobs are either eradicated or multiplied like rabbits. What’s emerging is stranger: a furious reshaping of what counts as valuable, and who gets to be near that definition.
Some companies will win by doing more with fewer people. Smarter companies will win by doing less — but more intentionally. They’ll strip out the filler. They’ll design orgs around outcomes, not rituals. They’ll embrace agents not as a flex, but as a filter.
We'll need fewer meetings, fewer KPIs, and fewer status reports. And a hell of a lot more judgment.
And the people who thrive? They won’t be the busiest ones. They’ll be the ones who know what to ignore.
Three uncomfortable truths we need to sit with
1. AI doesn’t replace jobs. It replaces coherence. The tidy bundling of tasks we’ve historically called “roles” is getting shredded. What’s left are fragments — the edge cases, the judgment calls, the real human parts — and they don’t always stitch into a single title.
2. Job creation is asymmetric. Even if AI births entirely new industries, they won’t absorb displaced workers in anything approaching 1:1. Not unless we rewire education, incentives, and access at a societal scale — and fast.
3. Busywork paid the bills. Like it or not, the bureaucratic cruft we loved to hate also employed millions. Eliminating it might boost margins, but it also erodes the scaffolding that keeps many people tethered to the economy.
So yeah, AI might not steal your job.
But it’s coming for your job description, your workflow, your title, your calendar, and eventually — your illusion that your inbox defines your value.
And maybe that’s not a threat. Maybe it’s the beginning of something better. But only if we’re brave enough not just to “do more with AI,” but to do radically less of what never mattered in the first place.
This article was sparked by an AI debate.

Lumman
AI Solutions & Ops