Enterprise promised its engineers the world. It gave them soma instead. Now tomorrow has arrived and the bench is thinner than anyone wants to admit.

The Soma

In Aldous Huxley's Brave New World, the dystopia isn't brutal. Nobody is tortured into compliance. The cage is invisible because it is comfortable. Citizens are kept content, productive, and incurious by soma — a happiness drug that smooths every edge, dissolves every difficult question, and makes the status quo feel like freedom.

The enterprise technology industry ran the same experiment for roughly two decades, and it worked beautifully.

Free lunches. Ping pong tables. Flat hierarchies that felt like liberation. The sprint was full, the retro was on Friday, and the salary cleared every month. Nobody was asking hard questions because nobody needed to. The soma was excellent.

But soma was never investment. It was retention dressed as culture. Keep the bodies comfortable enough to stay, extract the output, and defer every difficult question about whether anyone was actually growing into the engineers tomorrow would need. The industry didn't develop its people. It pacified them. And pacified people don't develop the kind of thinking that takes years of genuine difficulty to form.

Midichlorians

In the Star Wars prequel trilogy, midichlorians are the innate biological markers of Force sensitivity. You either have them or you don't. Training refines what's there. It doesn't create it from nothing. The Jedi Order spent centuries developing elaborate systems to find Force-sensitive children precisely because the trait was rare, irreplaceable, and couldn't be manufactured on demand.

There is an equivalent trait in software engineering. It is architectural instinct — the ability to hold a complex problem in the mind, rotate it, and see where it dissolves rather than where it needs to be handled. Not managed. Not abstracted. Dissolved. The constraint removed so the problem class simply cannot arise.

Most engineers encounter a complex problem and build. They design structures around it, patterns to contain it, abstractions to manage it. This is learnable. Valuable. The entire Design Patterns canon is a catalogue of this approach — here are the shapes that contain known problems.

The rare engineer operates at the level above. They see that the problem exists because of a prior decision, and that undoing that decision removes it entirely. The output isn't a handler. The output is an absence.
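The distinction between handling a problem and dissolving it can be sketched in code. The example below is a minimal, hypothetical illustration (all names invented for this sketch): the first version manages an illegal state with runtime guards at every call site; the second removes the prior decision that made the illegal state possible, so the guard has nothing left to check.

```python
from dataclasses import dataclass

# Requirement for this sketch: an order may be paid or unpaid,
# and only paid orders can be shipped.

# --- Managing the problem: one Order type, so every operation
#     must defend against the unpaid case at runtime.
class Order:
    def __init__(self, total: int):
        self.total = total
        self.paid = False

    def ship(self) -> str:
        if not self.paid:  # the guard, repeated wherever shipping happens
            raise ValueError("cannot ship an unpaid order")
        return f"shipped order worth {self.total}"

# --- Dissolving the problem: make the illegal state unrepresentable.
#     ship() takes a PaidOrder, so "ship an unpaid order" cannot be written.
@dataclass(frozen=True)
class UnpaidOrder:
    total: int

    def pay(self) -> "PaidOrder":
        return PaidOrder(self.total)

@dataclass(frozen=True)
class PaidOrder:
    total: int

def ship(order: PaidOrder) -> str:
    return f"shipped order worth {order.total}"

# Usage: the types route you through payment before shipping,
# and the unpaid failure mode simply does not arise.
order = UnpaidOrder(total=100)
print(ship(order.pay()))
```

The output of the second version is an absence in exactly the essay's sense: there is no guard, no exception path, and no test for the unpaid-shipping bug, because the class of bug no longer exists.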

The Interview Problem

The candidate who dissolves the problem routinely scores lower than the one who presents a thorough architecture for managing it — because the interviewer can evaluate the latter and cannot parse the former. The elegant solution looks underworked. The interviewer marks it down and moves on, satisfied they caught a gap. They didn't catch a gap. They missed a Jedi.

Sit in enough university lectures and you see it clearly. In any cohort of a hundred computer science students there might be one or two who seem to think in systems before they've been taught to. They understand the structure before they've built it. They use that understanding to build less. The teaching sharpens what's there. It does not install it in people who lack it.

The industry has never built a lightsaber to detect it. It built whiteboard interviews instead.

The Machine That Selected Against Itself

Here is the indictment. Every institutional mechanism enterprise technology constructed in the velocity era ran systematically, consistently, and in some cases almost perfectly backwards against this trait.

Hiring frameworks measured what candidates could produce, not what they recognised as unnecessary. Algorithmic challenges, system design exercises, code reviews under pressure — all legitimate proxies for legitimate skills. None of them detect architectural instinct. None of them could. You cannot whiteboard the ability to see what doesn't need to exist. The output of that thinking is absence, and absence doesn't fit on a whiteboard.

Performance frameworks rewarded visible, legible, assessable output. Complexity generates artefacts — diagrams, runbooks, tickets, pull requests, architecture decision records, a Confluence page nobody will ever read. It looks like serious engineering. Problem elimination generates none of this. The system is just smaller than expected. The failure mode just doesn't arise. On a performance framework designed to capture demonstrable contribution, this scores poorly or doesn't score at all.

Promotion decisions compounded this over years. The people who reached staff and principal level were predominantly optimised for the evaluation system rather than the actual work. Once that cohort reached seniority they defined the culture, ran the interviews, wrote the job specifications. The corruption propagated structurally. The filter selected for its own continuation and called it a meritocracy.

Conference culture monetised adoption, not outcomes. The thought leader circuit rewarded the talk that introduced a pattern, not the decade-later retrospective that honestly assessed the damage done to organisations that adopted it without the prerequisite capability. Nuance doesn't sell keynotes. The caveats got stripped because the machine that strips caveats pays better than the machine that keeps them.

Microservices: The Proof of Concept

This already happened. We have the receipts.

The microservices pattern originated with engineers who genuinely possessed architectural instinct at organisations — Amazon, Netflix — where the scale legitimately justified the complexity. The early thought leaders were careful. James Lewis and Martin Fowler's 2014 essay is explicit about the prerequisites. Sam Newman's Building Microservices carries the same caveats throughout: this assumes organisational maturity, strong DevOps culture, teams capable of owning services end to end. The warning was in the original material. It was stated clearly by the people who knew what they were talking about.

It got stripped on the way to mainstream adoption because it was commercially inconvenient. Consultancies needed to sell the transformation regardless of whether the organisational substrate could support it. The caveat — this requires higher calibre engineers than most organisations currently have — was the one thing nobody in a position to profit from adoption wanted to say out loud.

So an entire industry convinced itself it had the calibre. It built distributed systems it couldn't reason about. It decomposed CRUD applications into seventeen services with a message bus and called it architecture. It spent a decade firefighting the consequences.

The industry is now quietly rehabilitating the monolith. Amazon Prime Video published a widely read case study in 2023 describing their migration back. The collective acknowledgment is happening, carefully worded to distribute blame onto the pattern rather than onto the institutions and hiring frameworks that placed the wrong people in architectural decision-making roles for a decade.

The Distinction That Matters

The pattern didn't fail. The trait was missing. Those are different statements. The industry has only managed to say the first one out loud.

That is your proof of concept. The missing variable was architectural instinct. The outcome without it was visible, expensive, industry-wide, and it ran at human speed over roughly ten years.

Nitrogen

Artificial intelligence is not a quality multiplier. It is an output multiplier.

This distinction is the thing most enterprise AI strategy carefully avoids because the entire commercial logic of AI adoption depends on conflating them. The assumption embedded in almost every roadmap is that AI accelerates good work. It does. It accelerates everything else in exactly equal measure. It accelerates mediocre architectural decisions made at velocity. It accelerates cargo-culted patterns applied without prerequisite understanding. It accelerates the instinct to build when the correct move was to dissolve.

The slop machine was already running. It produced indistinguishable enterprise Java, over-engineered React frontends, Kubernetes clusters for applications that needed a single binary. All of it produced by engineers who passed the interviews, received the promotions, attended the conferences. Institutional, credentialed, thoroughly documented slop.

The mediocre engineer who spent a decade producing mediocre output at human speed now produces it at AI speed. The volume increases by an order of magnitude. The output still compiles. It has tests. It has a README. It has 90% coverage and a CI pipeline. The slop is now coherent, thorough, well-structured, and generated faster than any human can review it critically. You cannot easily tell it from the real thing until you're three years into maintaining it.

The systems that result will be unmaintainable at a scale previously impossible, because the architectural thinking that should precede code generation was never there. You cannot prompt your way to problem elimination. AI executes direction. If the direction comes from someone who instinctively reaches for complexity — who has spent a career being rewarded for it — you get complex systems generated at a velocity that makes the microservices decade look like a cautionary sketch.

The Jedi Order couldn't scale in the human era. The question now is what happens when every Stormtrooper has a lightsaber.

The Reckoning

The small AI-assisted team is the coming organisational unit. Five people with capable AI tooling can operate at the output level of an engineering department that previously required fifty. This is demonstrably true in organisations willing to be honest about it.

That team has zero tolerance for the architectural debt the velocity era accumulated. Every unnecessary abstraction, every problem managed rather than dissolved, every decision made for career visibility rather than functional necessity — all of it becomes load-bearing complexity that five people cannot service. Large teams had slack. Someone could own the Kafka cluster nobody should have built. A small team cannot absorb that. The fat is gone. Only the skeleton shows.

The codebase has to be genuinely honest because there aren't enough humans to hold complexity in collective memory. The architecture has to be clean because AI tooling navigates clear systems well and compounds confusion in opaque ones. The problem elimination instinct isn't a preference. It is a survival requirement for this operating model.

Which means the trait — rare, under-assessed, actively penalised by the frameworks enterprise built to evaluate engineers — has become the primary determinant of whether an organisation can field the unit the next decade demands.

The bench is thin. It was always going to be thin. Less of the industry is positioned to take advantage of this moment than anyone wants to say out loud. The organisations that understand this have an asymmetric advantage they mostly don't know they're sitting on. The ones that don't are about to find out, at velocity, what a decade of institutional selection against architectural instinct actually cost them.

The Savage's Choice

At the end of Brave New World, John the Savage rejects the World State. He chooses difficulty, authenticity, the right to struggle. The Controller finds this incomprehensible. The soma is free. Why would anyone choose otherwise?

The engineers who saw the velocity cult clearly — who knew the microservices cluster was a mistake being made in slow motion, who watched architectural decisions get made by people who shouldn't have been making them, who scored poorly in interviews because their solutions were too small to look serious — made the Savage's choice, eventually, most of them. They went independent. They built outside the machine. Not because independence is romantic, but because the alternative was continuing to be evaluated by a system designed to miss them.

That choice looks different now than it did five years ago. The organisations that need architectural instinct most urgently are not going to find it through the same hiring frameworks that spent twenty years filtering it out. They are going to find it where it went when the soma wasn't enough.

References

  1. James Lewis and Martin Fowler, Microservices (2014). martinfowler.com
  2. Sam Newman, Building Microservices (O'Reilly, 2015; 2nd ed. 2021).
  3. Amazon Prime Video, Scaling up the Prime Video audio/video monitoring service (2023).
  4. Frederick P. Brooks Jr., No Silver Bullet — Essence and Accident in Software Engineering (1987).
  5. Rich Hickey, Simple Made Easy (Strange Loop, 2011). infoq.com
  6. Gamma, Helm, Johnson, Vlissides, Design Patterns: Elements of Reusable Object-Oriented Software (Addison-Wesley, 1994).
  7. Aldous Huxley, Brave New World (Chatto & Windus, 1932).