Work is the thing we do when we are not doing what we want.
That’s the implicit definition that shaped the industrial centuries.
Work was factory belts, desk chairs, spreadsheets, and morning commutes.
But as the 20th century progressed, work took on a new meaning.
As machines replaced muscles, labor drifted upward — out of the fields, away from the foundries, and into the air-conditioned theater of the mind.
Intelligence (plus motivation, plus applying it in the right direction to the right problems) became the new throughput — not pounds lifted per hour, but decisions made and implemented per hour.
Suddenly, the most valuable workers weren’t the strongest, but those who could manipulate abstract symbols at scale: memorandums, contracts, spreadsheets, market forecasts, pitch decks.
The farmer’s hands gave way to the typist's.
What had once been a rarefied activity — the work of mandarins, viziers, generals, and imperial bureaucrats — became the daily routine of millions.
For most of history, “thinking for a living” had been an extremely niche practice, reserved for elites in robes and brass: court advisors in Han China, scribes in the Roman Empire, Catholic cardinals, strategists of the Ottoman court, and, more recently, the officer class of the British Empire or the logistical masterminds behind two world wars.
But in the second half of the 20th century, that managerial style of work — planning, organizing, delegating, predicting — was democratized.
The white-collar worker was born.
Millions entered offices armed not with swords or seals, but with typewriters, blueprints, calculators, and eventually laptops and smartphones.
They didn’t govern empires, but they learned to govern information.
They moved paper.
They moved money.
They moved meetings.
In this world, work became semiotic.
You wore seriousness on your body: pressed suits, sensible shoes — later, Patagonia vests zipped over blue Oxford shirts.
You sold nothing, built nothing, moved nothing — but you moved people and capital.
You “managed.”
You created leverage by making decisions about other decision-makers and the placement of capital into the right accounts.
This kind of work felt elevated: a realm where applied thought itself was the product, and the worker’s greatest tool was the email outbox.
But if you strip away the rituals — the posture, the forklift, the construction site, the CT scan, the surgery, the military expedition, the fisherman’s quota, the litigation meeting, the laptop, the calendar invites — and look at the raw behavior underneath… what exactly is work?
Ask most people, and they’ll reach for verbs.
Work is doing.
Work is grinding.
Work is producing.
Work is strategizing.
Work is communicating.
Work is crossing off a list.
That’s because we’ve been conditioned to equate effort with value.
Sweat equals virtue.
But the question is no longer rhetorical.
When artificial intelligence begins to do — when it can write emails, debug code, design logos, summarize reports — the old definitions collapse.
And when that intelligence is given a mechanical body (not just humanoid, but birdlike, mammalian, or automotive), when it can lay bricks, water gardens, stock shelves, repair neurological damage, knit sweaters, repair underwater pipe leaks, scale mountains, deliver cargo — the “physical doing” isn’t human anymore either.
So then: what’s left?
What does work become when the friction is gone?
We need to separate the sacred from the automated.
We need to ask whether our human efforts in the 2030s will still be labor, strategy… or something new.
Labor and strategy
In the year 880, on the edge of Christendom, a man lifts stone for a monastery wall. He is one of thousands in the region doing similar work — farming, carving, hauling, obeying.
The laboring class numbers in the millions, across Europe, the caliphates, Asian empires, and beyond. Their work is physical, repetitive, and bound to land or lord.
And who decides?
A few hundred in each kingdom, at most.
The local baron, the bishop, the abbot — literate men of title who issue commands in Latin or by decree. They are the strategists, not because they are wiser, but because they are permitted to choose. Thought is not democratized. To be a decider is to be set apart, not just in education, but in birth and power.
Labor builds the world.
Strategy, mostly local, decides what is built and why.
A few dozen people in the entire world — Byzantine emperors, Tang diplomats, Abbasid caliphs, Catholic popes, Carolingian kings, and the like — undertook work you could call "strategy" on a scope exceeding the regional limits of a local township.
A few thousand merchants at strategic ports and along the silk roads practiced a narrower strategy — a limited notion of "global trade" that required them to think and coordinate beyond their own city.
Practically no one worked strategically. Practically everyone labored.
Fast forward to 1880. The laborer is laying rail across the American West.
He swings a hammer all day, for low pay, under dangerous conditions.
Unlike the monastery worker, this man has a wife. She does domestic work from sunup till sundown. She has some of the benefits of electricity, but she doesn't work fewer hours than someone in 880 AD. It's possible she works even more: artificial light theoretically allows people to do more, especially in winter.
There are hundreds of thousands like them — the men working as miners, builders, factory workers — all feeding the machine of industry.
The work, if it ever was sacred, is no longer; it is systematized.
Paid by the hour, managed by the shift.
The railroad worker has no relationship with the person whose capital is funding the railway. Layers of management exist between him and the robber baron.
Because something has changed.
In Lower Manhattan, in downtown San Francisco, new rooms have appeared: offices.
Not for the Rockefellers, but for their lieutenants.
Clerks, schedulers, foremen, accountants.
A thin but growing managerial class.
These are not capital owners, but they aren’t laborers either. They don’t build the railroad — they help coordinate the building.
They relay orders, organize payroll, track shipments, smooth the frictions of scale.
Strategy still belongs to a few hundred industrialists. But now tens of thousands more have begun to taste its edges — not making the biggest decisions, but interpreting them, carrying them downward and upward like messengers in a growing bureaucracy.
By 1980, this intermediary class has multiplied. Now there are millions of Americans who do not labor in the traditional sense.
They sit at desks. They write memos. They manage people who manage other people.
The service economy has exploded. So has the idea of "knowledge work." And with it, a new confusion.
Everyone wants to be a thinker now. Everyone wants to be strategic.
The first boomers are beginning to look at their children and think: "You need to get an education so you don't have to work in a factory, or at McDonald's, for a living."
The reality is ambiguous.
Most corporate employees in 1980 are not inventing ideas. They are forwarding them. They are implementing plans built by someone else, somewhere else, higher up.
Much of it remains labor dressed up as strategy.
The distinction still exists, but it is harder to see.
The old world had clear roles: the mason, the monk. The capitalist, the clerk.
Now the boundaries are blurred. Everyone sits at the same kind of desk, but not everyone is making real decisions.
Across these three eras, the pattern holds: far more people labor than decide.
The shortcut generations
By 1980, the ladder from labor to strategy was already crowded. Millions of Americans no longer worked with their hands but with words, approvals, and systems.
The managerial class had become a dominant force, tasked with coordinating complexity. Yet the hierarchy still held: most were executing decisions made by others, not originating strategy themselves.
But around this time, a subtle shift began.
The computer entered the workplace not just as a glorified typewriter, but as a tool for building. At first, it was domain-specific: engineers, analysts, accountants.
But slowly, it became programmable. And those who could program it discovered something strange: they no longer needed to rise through the ranks to make decisions. They could create their own systems. Build their own companies.
By the 2000s, this had become a pattern. A teenager with a laptop could write code that touched millions. A small team could build a product in a dorm room and sell it globally without asking permission.
This wasn’t just technological disruption—it was organizational inversion. The ladder had a hole cut through the bottom rung.
In the past, young people had always had to wait their turn. They apprenticed, climbed, endured.
But now, if you understood how to speak the language of the machine, you could skip the line. Strategy was no longer handed down; it was compiled.
In some cases, you didn't even have to speak the machine's language yourself; you just needed to know how to sell a vision, and to organize and direct the work of people who did.
This period, roughly from the 1990s through the 2010s, created the mythos of the young founder. Zuckerberg in his hoodie. Jobs in his garage. Gates dropping out.
While many were the children of privilege, their power came not from inheritance alone but from technical fluency.
They understood something the older generation did not. They knew how to talk to the computer.
This was the essence of the disruption. It wasn’t just the products they built. It was the structure they bypassed.
And with that, they built companies full of people like themselves—young, ambitious, and unencumbered by hierarchy. Few people over 45 worked at these startups at first, not because of discrimination, but because the entire organizational logic had changed.
For a generation, this looked like a new rule: the young disrupt the old, because they know the next tool first.
But every rule has its context. And in hindsight, the idea that "the young disrupt the old" may not have been a rule at all. More on this later.
What is work worth?
A fence needs repairing. A man is hired. He shows up with tools, works for four hours in the sun, hammering, sawing, re-aligning the boards. When he finishes, you pay him.
The cost feels fair. You saw him labor. You heard the tools, felt the dust. You trust the effort. The value seems justified by the exertion. The fence is in great shape now.
But what if he hadn’t finished the job?
What if he worked just as hard, for just as long, but left the gate crooked and a few boards loose? You wouldn’t pay the full amount. Maybe not at all. He might have "labored hard" but he didn't "do the work".
So: effort alone isn’t enough. The work has to work.
But now imagine another case. He does fix the fence — perfectly — but he finishes in one hour, not four.
You agreed to $400 because he said it would take half a day. You still get the fixed fence. But you feel... tricked. Like you overpaid.
Why?
If the outcome is the same, why should speed make us uncomfortable?
Shouldn’t we admire efficiency?
We don’t, always.
Because somewhere in us, we believe work should feel hard to be worth something. That belief didn’t come from nowhere.
Adam Smith observed that "labor... is the real measure of the exchangeable value of all commodities."
Hannah Arendt argued that modern societies treat labor as not just economic, but moral — as something that earns one’s place in the world.
This ethic is so deep in us that we’ll want to dock someone’s pay proportionally if they’re too good at their job. Why should I pay for 4 hours when it only took 1?
And yet we also wouldn’t pay someone who labors badly.
If the man works all day and leaves a shaky fence, we call him incompetent.
If he works too quickly and leaves a perfect one, we call him overpriced.
So what exactly are we paying for?
Not just the outcome. Not just the time. Something more fragile — a narrative. We want the story of effort to match the result.
Now consider a lawyer.
She bills you $400 an hour. She reads your case, drafts a brief, makes a call. She doesn’t sweat or lift or visibly strain — but she spent ten years earning the ability to know exactly what to do.
You’re not paying for her typing. You’re paying for the part where she knows what not to say, what to emphasize, and how to win.
But even here, the same tensions apply.
What if she solves your problem too quickly? What if she reads your case and tells you, in five minutes, “You’re fine. Just don’t sign it.”
You might thank her. But you might also hesitate. Do I really owe $400 for that?
If she had spent three days agonizing over it, you’d feel more certain the money was well-spent.
But the outcome didn’t change — only the optics of the effort.
Now imagine she gets it wrong. She labors, she charges, she delivers — and she loses. You sign something you shouldn't have, and you find yourself in expensive litigation, or out thousands of dollars resolving a contract dispute.
Suddenly the effort doesn’t matter at all. You don’t care how hard she worked. You care that she failed.
So again: we don’t just pay for time. Or effort. We also pay for outcome, filtered through judgment, wrapped in a story that feels proportional to the cost.
Now picture a strategist. A consultant. An advisor.
He listens to your situation. He doesn’t ask for binders or spreadsheets. He pauses, nods, and says one thing that instantly changes how you see the problem.
It’s not advice; it’s a lens. And in that moment, everything shifts.
You know it’s valuable. You feel it in your gut. But if someone told you he charged $10,000 for that meeting, your reflex might be to laugh.
“Ten grand? For a sentence?”
But what if that sentence saves your company? What if it prevents a lawsuit? What if it stops you from wasting six months?
Then suddenly, $10,000 sounds like a discount.
The work wasn’t visible. It wasn’t painful. But it was rare.
And rarity, combined with consequence, is often where we locate value — even if it doesn’t feel like “work” in the traditional sense.
This is the world AI now walks into.
It doesn’t strain. It doesn’t bill (much). It doesn’t even pause.
It simply produces instantly, endlessly, without any performative visible effort.
Which brings us to the next question:
If friction disappears… what happens to the value of work?
AI and the collapse of friction
Artificial intelligence doesn’t work like we do. It doesn’t tire. It doesn’t hesitate. It doesn’t second-guess. You give it a prompt, and it gives you a result. No scheduling. No commute. No negotiation. Just output.
It writes, designs, summarizes, predicts. It drafts emails, legal disclaimers, product mockups, birthday invitations, marketing plans, and school essays — not in hours, but in seconds. Not with effort, but with energy. A few trillion electrons rearranged into language.
This is not how work is supposed to look. At least not to us.
For all of human history, every form of work had friction. Not just physical — though that too — but procedural, social, psychological. You had to train. You had to practice. You had to wait your turn, get picked, get promoted, learn the ropes. Even mental work had a kind of ritual to it: the preparation, the pacing, the late nights, the cost.
Friction gave work a kind of moral scaffolding. It helped justify compensation. It gave dignity to effort. It made outcomes feel legitimate — because something had been endured.
Now the endurance is gone.
AI can produce a full slide deck in the time it takes you to refill your coffee. It can summarize a 200-page document faster than you can read the title. It can write a business plan, a marketing email, and a hiring brief — in parallel — without ever needing sleep, encouragement, or clarity. It doesn’t need rest. It doesn’t need meaning. It doesn’t even need to want.
The cost of producing intelligence — or at least the appearance of it — is approaching zero.
And that messes with our heads.
If the work is instant, what’s the value? If there’s no strain, where’s the story? If there’s no labor, why should there be a price?
We used to believe that removing friction was progress. Efficiency was a virtue. Streamlining was a win. But perhaps friction wasn’t just a bug in the system — it was a feature of our values.
Friction signaled effort. Effort signaled legitimacy. And legitimacy supported the idea that someone earned what they were paid.
AI skips all that.
Which is why its output — even when impressive — can feel cheap. It’s not that the answers are wrong. It’s that the answers seem to appear from nowhere, unburdened by the rituals we associate with creation.
We are economically confused, but something deeper is happening too:
We are spiritually disoriented.
Because if the machines can do the work — not just faster, but without even trying — then what happens to the meaning we once found in the struggle?
What happens to work when there’s no more weight behind it?
And what happens to us?
Interlude: This time, the incumbents are ready
AI is not a youth movement.
Earlier, we explored how the computer revolution let the young disrupt the old.
A teenager in a hoodie could rewrite the rules of commerce or communication because they understood the machine before their bosses did.
The young weren’t just innovative — they were early. Being early, being smart, and being hungry was enough.
But this time is different.
Artificial intelligence is not being dismissed by the over-35 crowd. It’s being adopted by them.
Executives aren’t scoffing. They’re experimenting.
Corporations aren’t resisting. They’re investing.
Governments are drafting frameworks, not bans (mostly).
Universities are rushing to incorporate, not reject.
We’re not seeing a generational gap like we did with rock music, personal computers, or psychedelics.
The CEOs are in.
The incumbents aren’t asleep. They’re racing.
There is no “parental blind spot” to exploit.
No lag in cultural adoption. AI is not a youth movement. It’s a systems upgrade.
Even in crypto — often presented as the last frontier of decentralization and generational rebellion — we’ve seen the lifecycle collapse into institutional capture.
What began in basements ended in ETFs. The language of rebellion was monetized, and then absorbed.
This matters because it breaks the myth that generational turnover is always disruptive.
The AI moment is not a handoff to the young. It’s a convergence. The people with power are embracing the tools that traditionally displaced them.
Which means that if you're young, fast, or clever — you may not be competing against inertia, as your Gen X and Millennial predecessors did in decades past.
You may be competing against acceleration this time.
So if AI is being adopted at the top (and not bottom-up from underground subcultures), and most of what we considered "work" is being automated, then we must look harder at what, exactly, remains.
The illusion of being strategic
There’s a kind of quiet panic settling over the professional world, but it doesn’t look like panic. It looks like posture. Like assurance. Like executives nodding in glass-walled rooms. Like teachers revising syllabi.
It looks, in other words, like strategy.
But if AI has done anything, it’s exposed the hollowness at the center of our most important decisions. We thought strategy was intention. Foresight. Agency. But often, strategy is just etiquette around complexity — a way of narrating decisions that have already been made by systems too big to see and too slow to interrupt.
The strategist imagines herself upstream. But the stream is automated now.
AI doesn’t just produce answers. It forces questions. And not technical ones. Existential ones. If your job is to synthesize, but the machine synthesizes better; if your role is to analyze, but the machine sees patterns you never would; if your value is “insight,” and insight is now a service — then what, exactly, are you?
It’s not just consultants and managers on the chopping block. It’s a whole architecture of meaning.
The mentor’s wisdom collapses when the mentee can query a million sources in seconds. The CEO’s vision dissolves when the AI forecasts faster. The teacher’s lesson plan flattens when the student can simulate the lesson — and its alternatives — before class. The essayist, the policymaker, the coach, the priest, the founder — all begin to hear their thoughts echoed back before they’ve finished forming them.
What’s being revealed is not that humans can’t think. It’s that much of what we celebrated as “thinking” was compression. Familiarity. Tasteful remix. And those things, it turns out, are highly learnable.
We are entering an era where the appearance of thought is easy. The real kind — the kind that wounds and risks and commits — may become rarer, not more common.
So the illusion isn’t that we were never strategic. The illusion is that strategy was ever common.
It wasn’t.
Most of the time, we were improvising. Naming patterns after we saw them. Projecting coherence onto momentum. But the machines won’t need to pretend. They’ll just compute.
And that leaves us, again, with the real question:
What must remain human, even if it doesn’t need to be?
What deserves to be done slowly?
What deserves to be done by hand?
What deserves to be decided by a person, even when the algorithm is better?
What deserves to be called work?
What is work?
If labor can be done by machines, and strategy can be exposed as performance, what remains?
What remains is not less important. It might be all that matters.
By 2035, the landscape of work will look radically different. But it will not be empty.
It will be filled with new kinds of human decisions. Not because machines failed to take them over, but because we chose to preserve the burden — and privilege — of choice.
Some specific futures of human work:
- Crisis arbitration. In a world of autonomous systems, the human role will be to adjudicate between machine-driven outcomes with no clear ethical consensus — medical triage, refugee resettlement, planetary climate tradeoffs.
- Narrative design. Someone will need to explain the choices machines make, to shape the story around AI-driven governance, to maintain social cohesion in the face of impersonal optimization.
- Collective meaning work. As more material needs are met frictionlessly, humans will return to ancient questions: What is a good life? What is worth preserving? Schools, communities, and even workplaces will employ people not just to train or manage, but to orient others morally and culturally.
- Synthetic boundary keeping. Humans will be hired to define the edge of acceptability — to say where the machine must not go. In creative fields, this means drawing a line around taste. In policy, it means choosing what not to automate.
- Relational anchoring. Many humans will do jobs that are primarily emotional: companions, listeners, ritual holders, conflict mediators. The job will not be to know, but to be.
- Slow-space stewardship. As most digital work accelerates, a premium will emerge on slowness. Luxury will not be speed, but care. A new artisan class may form around depth: book editors, garden designers, craftspeople, philosophers.
Some things, however, may vanish altogether:
- Junior copywriters. AI will be able to write passable ad copy at scale — faster, cheaper, and on-brand.
- Customer support agents. Natural language models will handle the vast majority of Tier 1 and Tier 2 support across industries.
- Entry-level analysts. Data parsing, trend detection, and reporting will be automated and visualized in real-time.
- Paralegals. Document discovery and basic legal research will be handled more quickly and accurately by machines.
- Executive assistants. Scheduling, travel booking, note-taking, inbox management — all increasingly automated.
- Retail cashiers. Automated checkouts and embedded payment systems will make the role redundant in many locations.
- Delivery dispatchers. Optimized routing and fleet coordination are well-suited for algorithmic decision-making.
- Basic translators. Real-time translation tools will outperform humans in all but high-context or sensitive scenarios.
- Market researchers. Consumer insight gathering will be continuously synthesized from live data streams.
- Online tutors. AI tutors will scale personalized education with minimal human oversight.
- Basic graphic designers. Template-based, prompt-driven visual content creation will dominate the low end of the design stack.
- Middle-tier accountants. Routine tax preparation and financial reconciliation will become largely automated.
These are not predictions of doom. They are recognitions of change.
Some things will stick:
- Barbers. Because the service is local, physical, and trust-based. A machine might do it safely, but people won’t want it.
- Yoga teachers and fitness instructors. Because the value is in in-person instruction, accountability, and human mirroring, not just movement tracking.
- Police officers and firefighters. Because the job requires fast decisions in unpredictable, unstructured environments involving human life.
- Religious leaders. Because authority in faith traditions is built through community participation and shared values, not computation.
- Politicians and MPs. Because democratic legitimacy requires visible human accountability, even if the system is flawed.
- Lawyers and advocates. Because arguing a legal case involves nuance, negotiation, and persuasion, not just rules.
- Nurses and care workers. Because bedside care depends on emotional presence and physical attentiveness, not just protocols.
- Chefs and servers. Because restaurants are social spaces and hospitality involves interpersonal nuance.
- Teachers. Especially in early education, because children learn from modeling behavior and emotional attunement, not just curriculum.
- Midwives and doulas. Because birth support is emotionally intense, physically variable, and requires trust.
- Funeral directors. Because death and grief are handled through cultural rituals that rely on human coordination.
- Artists and performers. Because audiences still value live presence and risk — not just technical proficiency.
- Bartenders. Because service roles in social environments depend on human judgment and interaction.
- Construction workers and skilled tradespeople. Because the variability of physical job sites and edge cases are still too complex for full automation.
These roles will adapt. Some will integrate AI. But they will remain recognizably human.
Work will not vanish. But the word will stretch.
Work will no longer mean the thing that keeps the machine running. It will mean the thing we do to stay human.
So what is work?
Work is choosing wisely. Work is designing systems with consequence in mind.
Work is asking: what deserves to exist?
Work is building the conditions for others to thrive.
Work is the defense of the human against the efficient.
And maybe, most of all—work is what remains sacred when the friction is gone.