
The Collapse of the Mechanical Tier


Why Tech’s Golden Age Was an Accident — and Why It Isn’t Coming Back

For a brief and historically strange moment, technology enjoyed a privilege almost no other field has ever had. It wasn’t just high-paying. It wasn’t just prestigious. It was procedurally democratic.

You could follow a recipe and reliably convert effort into leverage.

Study the fundamentals. Grind the interview questions. Learn the frameworks. Pass the gates. Repeat.

You didn’t need to be singular. You didn’t need taste, timing, or patronage. You didn’t need to be the next Taylor Swift. You needed compliance with a system that rewarded mechanical competence at scale.

That was the bargain. And for a few decades, it held.

AI didn’t break that bargain by being creative. It broke it by collapsing the mechanical bottleneck that made the bargain possible in the first place.


I. What Tech’s Special Privilege Really Was

Tech is often described as a creative industry. That framing is incomplete, but not entirely wrong.

Software work has always had a creative component. Designing systems, choosing abstractions, and navigating tradeoffs require judgment and imagination. But over time, especially in the last decade, the center of gravity shifted. Creativity became front‑loaded and rare, while day‑to‑day labor became increasingly mechanical and repeatable.

The dominant direction of the industry was not re‑imagination, but reuse. In practice this meant buying instead of building, composing existing systems instead of inventing new ones, and gluing components together rather than redesigning foundations.

It was how incentives trained people to work. Performance systems rewarded visible impact, leverage over originality, and simplification over invention. If you are honest, most review cycles didn’t ask what you reimagined; they asked what you reused, unblocked, or shipped.

As a result, tech’s real advantage became mechanical leverage combined with broad participation. Software let a single person produce output that scaled infinitely, but just as importantly, it let millions of people participate without needing to be exceptional. Competence was enough.

In music, illustration, writing, or performance, effort does not map cleanly to outcomes. You can do everything right and still disappear. In tech, effort mapped to employability with unusually low variance, precisely because so much of the work had been standardized.

Tech created something rare: a fat, stable middle. A world where you could be a session musician and still pay rent. Where you didn’t need genius, just reliability. AWS‑level reliability, applied to human labor.

Premium pay wasn’t a reward for pure creativity. It was rent on scarcity created by reliable mechanical execution at scale.


II. Why Mechanical Skill Used to Protect You

Historically, intent had to pass through embodied skill.

If you wanted to act at scale, you had to earn the right through repetition, training, and constraint. Mastery was not optional and it was not compressible.

You could not become a master swordsmith or a competent swordsman overnight, no matter how strong your intent was. Years of embodied practice were the price of admission. The same was true in software. You could not simply decide to be a top systems programmer. Writing C++ well enough to matter required years of exposure to failure modes, edge cases, and invisible constraints that only surfaced with time.

That delay mattered. It limited who could act, how fast they could act, and how much damage or value they could produce before institutions noticed.

This coupling did more than create productivity. It throttled power. It slowed harm. It created wage floors because skill was a rate limiter. Not everyone could execute, so those who could were protected by time, friction, and replacement cost.

Skill didn’t just increase output; it slowed replacement, anchored wages, and introduced enough friction to keep labor markets from repricing overnight.


III. AI as a Decoupling Event

AI severs the link between intent and execution.

You no longer need to embody the skill. You describe the outcome. The system fills in the mechanics.

This is not unprecedented. It is exactly what happened in warfare when guidance systems replaced muscle and training. Once you no longer need to swing the sword a thousand times, initiative and targeting dominate. In theory, anyone can be lethal. In practice, this doesn’t mean permanent chaos; it means power equalizes until institutions, norms, and controls reassert themselves. Skill stops being the limiter, and organization takes over.

Software just crossed the same threshold.

The moment intent bypasses skill, the bottleneck moves away from individuals and toward organizations, authority structures, political gatekeepers, and, increasingly, sheer serendipity.


IV. Why Premium Pay Evaporates When Mechanics Evaporate

Markets rarely reward effort or difficulty directly; they reward scarcity, and only for as long as that scarcity persists.

Once execution becomes cheap and abundant, it stops commanding rent. Even if the quality is lower, cost overwhelms the delta. AI does not need to outperform experts. It only needs to be cheaper than the median.

This creates an asymmetric injury. Skills that took years to acquire lose demand in a fraction of that time, sometimes months. The work still matters. The market simply no longer values the labor pathway that produced it.

What disappears first is not exceptional work, but the wide middle tier that once made competence economically viable.


V. This Has Happened Before

This is not the first time an artisan class has collapsed under industrialization.

Metalworkers and handloom weavers learned this in the 19th century. Samurai and swordsmiths learned it in Meiji Japan. Coachbuilders learned it when the assembly line arrived.

In each case, there was a recognizable halfway point, but it is easy to misread what that point actually represents.

The halfway point was not primarily about buyers changing their taste or suddenly becoming indifferent to craft. It was about labor participation. As tools became good enough to encode, standardize, and reproduce skill at scale, large numbers of people who had once been required to produce the work were no longer needed.

Only after that contraction did buyers stop caring how the thing was made. Commoditization followed the collapse of skilled labor participation, not the other way around. Society changes, but human capacity to absorb disruption does not. That constraint is remarkably stable, which is why past transitions are still useful guides for the speed and shape of collapse.

After that point, demand destruction accelerated faster than retraining could occur.

For software, that halfway point arrived around 2022–2023.

We are now two to three years past peak labor leverage, even if social narratives haven’t caught up.


VI. The Acceleration Curve

A consequence of network‑speed industrialization is that labor half‑life collapses.

In prior transitions, artisan classes didn’t decline gradually. Once the halfway point was crossed, employment followed a predictable curve: a sharp drop to roughly half its previous size within a single decade, followed by long stagnation.

Handloom weavers lost roughly half their numbers in the decade after 1830, once power looms spread in earnest, and most of the rest in the decade after that. Samurai and sword‑adjacent trades collapsed within one generation after Meiji centralization. Coachbuilders were largely gone by the mid‑1920s, less than twenty years after the assembly line.

Software shows the same early signal, compressed.

Peak tech employment occurred around 2021–2022. Since then, output has risen while headcount has fallen. Productivity gains are not translating into rehiring because execution no longer bottlenecks growth.

If the historical pattern holds, within ten years tech will employ roughly half the number of people it once did, even as software continues to expand its reach across the economy.
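
As a back‑of‑the‑envelope sketch of that claim (the ten‑year half‑life is an assumption carried over from the historical cases above, not a measured constant):

E(t) ≈ E_peak × 2^(-t/10), with t counted in years past the peak.

From a 2021–2022 peak, E(10) is half of E_peak, which lands around 2032. That is all the “roughly half within a decade” claim amounts to: a decay curve followed by long stagnation, not a cliff followed by recovery.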

This is not a collapse of software. It is a collapse of software labor.

What disappears is not code, but the broad, repeatable employment class built around producing it.


Every previous industrial collapse had friction. Machines had to be built. Factories had to be installed. Capital had to accumulate. Regions lagged.

Software has none of that friction. Distribution of software tools is effectively instantaneous, closer to an update than an installation. AI capabilities propagate globally the moment they ship, without waiting for factories, supply chains, or local expertise. There is no local buffer and no geographic lag.

This is why the collapse compresses so violently. Human institutions still move at human speed, but the tools do not. What once unfolded over decades now plays out in years, and what took years now takes months.


VII. “Taste, Value, and Judgment” Is the Right Answer to the Wrong Question

A common response to automation anxiety is to invoke Jevons’ paradox: when something becomes cheaper, demand expands, and total labor may even increase. This is often offered as reassurance.

Historically, this argument is incomplete. Jevons’ paradox describes system‑level consumption, not individual livelihood. Demand may grow, but participation does not remain evenly distributed. The additional value accrues to owners, coordinators, and capital, not to the displaced artisan class.

In every previous transition, productivity gains eventually created abundance, but only after a long and uneven lag. The people who paid for that lag were the workers whose skills were devalued faster than new roles appeared. Knowing that things may rebalance in twenty years is cold comfort if you cannot outlive the gap. As the adage often attributed to Keynes goes, the market can remain irrational longer than you can remain solvent.

This is sometimes described as Engels’ pause: the period where productive capacity races ahead while wages, roles, and social structures lag behind. History suggests that pause is measured in decades, not quarters.

Appeals to taste, judgment, and creativity function similarly. They may be true in the long run, but they do not preserve broad participation in the meantime. Attention is finite, legitimacy is scarce, and markets for judgment are narrow. Even without AI, the attention economy was already fraying. AI accelerates that fragmentation rather than resolving it.

This isn’t a case of the ladder shifting upward; the structure that made the ladder usable for most people is no longer there. The right question is no longer how to climb it, but what real choices remain, and what each one costs in terms of agency, safety, and upside.


VIII. Judgment Without Leverage (The Named Human Trap)

After execution collapses, institutions still need humans.

Someone must sign off. Someone must absorb liability. Someone must reassure stakeholders that a person is “in the loop.”

This produces a structurally fragile role: the named human.

Highly trusted. Highly visible. Temporarily well paid.

But no longer in control of production.

The logic is clean:

Execution is cheap. Risk must sit somewhere. Risk is assigned to judgment. Judgment is assigned to a person.

The syllogism works on paper, which is precisely why it is so hard to argue against inside institutions. But its outcome is perverse. It produces roles that are highly trusted yet weakly empowered, where responsibility accumulates faster than control. What looks like preservation of human judgment is, in practice, a concentration of liability without leverage.

You are no longer the cook. You are the taster.

Prestige without control is not power. Trust is not a moat when replacement cost trends toward zero. When you stop producing and start validating, your skill stops compounding and starts depreciating with institutional patience.


IX. Where the New Bottlenecks Are

When mechanical bottlenecks collapse, value migrates to narrower choke points. These are not new roles so much as new filters:

Authorization: who is allowed to deploy systems at scale. Selection: what gets built and what gets ignored. Coordination: turning cheap execution into outcomes. Legitimacy: who people believe when systems fail.

These bottlenecks are real, but they are narrow. Fewer people pass through them than before, and fewer still stay there for long.


X. The Paths Forward

After every artisan collapse, the options converge, and history is fairly consistent about what those options look like.

Some move to the floor. Trades, nursing, physical systems. Lower upside, higher predictability. In the 19th century, displaced weavers and metalworkers moved into factory labor, agriculture, or maintenance work that remained tied to physical presence. After Meiji centralization, many former samurai became bureaucrats, teachers, or police officers—roles with lower status but stable income.

Some attempt to be the last expert standing inside the old system. The Highlander path. High variance, eroding margins, constant threat.

This is not the same as becoming a singular artisan under patronage. This path stays inside the legacy contract: the same institutions, the same incentives, and the same assumptions about value—just with fewer peers.

Historically, this shows up as specialists whose skills remain necessary but no longer expanding. COBOL programmers maintaining decades‑old financial systems are a modern example, as are engineers responsible for long‑lived satellites and spacecraft that cannot be easily replaced or rewritten. Their knowledge is real and scarce, but it is boxed in. Bargaining power shrinks as work becomes maintenance rather than growth, as younger labor can be trained “just enough,” and as institutions systematically invest in replacement strategies.

Survival here does not imply leverage. It often means slower decline, higher stress, and dependence on systems that are actively trying to make you unnecessary.

Some survive as singular artisans under patronage. Bespoke, identity-driven, and rare. Swordsmiths in Japan, coachbuilders serving luxury clients, and later haute couture ateliers all persisted, but only by becoming culturally symbolic rather than economically central. These paths favored reputation, timing, and sponsorship over scale.

What disappears is the ability to maximize agency, safety, and upside simultaneously.

Tech once cheated this triangle. AI removes the cheat code.


XI. Conclusion

The usual advice in moments like this is to learn new tools, reskill, or wait for the cycle to turn. That advice sounds practical, but it is largely dishonest.

Reskilling assumes the bottleneck will return to individuals. Learning AI assumes the system will again reward broad participation. Neither assumption holds. The bottleneck has already moved, and it is not moving back.

The honest questions during periods of upheaval are harder and less comforting. What outcomes are actually realistic? Where have the bottlenecks moved? What is still under individual control, and what is not? What is the least bad outcome available, and what does it cost?

Yes, taste, value, and judgment can still work—if you are prepared to become a singularity. That path trades safety for upside and accepts winner‑take‑most dynamics. Staying inside the system can also work, if you are willing to run Red Queen races indefinitely, competing harder for shrinking leverage and living with constant replacement pressure.

If neither of those is acceptable, it is reasonable to look elsewhere. Trades, nursing, and other forms of physical or care‑based work do not offer prestige, but they offer floors. They remain locally grounded, harder to compress, and tied to presence rather than abstraction.

Hoping this is temporary is unrealistic. Software expanded so quickly because it rode on global integration. That world is fragmenting. Trade is fragmenting. Identity is fragmenting. Even without AI, tech employment was already shrinking. AI did not start the fire; it sealed the exits.

This is not a story about individual effort, merit, or personal failure. It is about how bottlenecks shift, how leverage concentrates, and how power reorganizes when execution becomes cheap. Once you see that, the question stops being how to win the old game, and becomes which game you are willing to play, and what you are willing to give up to do so.

Things work out eventually. Just not for most of the people who paid for the transition.